Dec 10 09:29:47 localhost kernel: Linux version 5.14.0-648.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Dec 5 11:18:23 UTC 2025
Dec 10 09:29:47 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 10 09:29:47 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64 root=UUID=cbdedf45-ed1d-4952-82a8-33a12c0ba266 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 10 09:29:47 localhost kernel: BIOS-provided physical RAM map:
Dec 10 09:29:47 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 10 09:29:47 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 10 09:29:47 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 10 09:29:47 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 10 09:29:47 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 10 09:29:47 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 10 09:29:47 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 10 09:29:47 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec 10 09:29:47 localhost kernel: NX (Execute Disable) protection: active
Dec 10 09:29:47 localhost kernel: APIC: Static calls initialized
Dec 10 09:29:47 localhost kernel: SMBIOS 2.8 present.
Dec 10 09:29:47 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 10 09:29:47 localhost kernel: Hypervisor detected: KVM
Dec 10 09:29:47 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 10 09:29:47 localhost kernel: kvm-clock: using sched offset of 3264607217 cycles
Dec 10 09:29:47 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 10 09:29:47 localhost kernel: tsc: Detected 2799.998 MHz processor
Dec 10 09:29:47 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 10 09:29:47 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 10 09:29:47 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec 10 09:29:47 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 10 09:29:47 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 10 09:29:47 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 10 09:29:47 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 10 09:29:47 localhost kernel: Using GB pages for direct mapping
Dec 10 09:29:47 localhost kernel: RAMDISK: [mem 0x2d46a000-0x32a2cfff]
Dec 10 09:29:47 localhost kernel: ACPI: Early table checksum verification disabled
Dec 10 09:29:47 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 10 09:29:47 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 10 09:29:47 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 10 09:29:47 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 10 09:29:47 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 10 09:29:47 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 10 09:29:47 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 10 09:29:47 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 10 09:29:47 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 10 09:29:47 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 10 09:29:47 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 10 09:29:47 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 10 09:29:47 localhost kernel: No NUMA configuration found
Dec 10 09:29:47 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec 10 09:29:47 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Dec 10 09:29:47 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec 10 09:29:47 localhost kernel: Zone ranges:
Dec 10 09:29:47 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 10 09:29:47 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 10 09:29:47 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec 10 09:29:47 localhost kernel:   Device   empty
Dec 10 09:29:47 localhost kernel: Movable zone start for each node
Dec 10 09:29:47 localhost kernel: Early memory node ranges
Dec 10 09:29:47 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 10 09:29:47 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 10 09:29:47 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec 10 09:29:47 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec 10 09:29:47 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 10 09:29:47 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 10 09:29:47 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 10 09:29:47 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 10 09:29:47 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 10 09:29:47 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 10 09:29:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 10 09:29:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 10 09:29:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 10 09:29:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 10 09:29:47 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 10 09:29:47 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 10 09:29:47 localhost kernel: TSC deadline timer available
Dec 10 09:29:47 localhost kernel: CPU topo: Max. logical packages:   8
Dec 10 09:29:47 localhost kernel: CPU topo: Max. logical dies:       8
Dec 10 09:29:47 localhost kernel: CPU topo: Max. dies per package:   1
Dec 10 09:29:47 localhost kernel: CPU topo: Max. threads per core:   1
Dec 10 09:29:47 localhost kernel: CPU topo: Num. cores per package:     1
Dec 10 09:29:47 localhost kernel: CPU topo: Num. threads per package:   1
Dec 10 09:29:47 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 10 09:29:47 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 10 09:29:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 10 09:29:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 10 09:29:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 10 09:29:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 10 09:29:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 10 09:29:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 10 09:29:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 10 09:29:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 10 09:29:47 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 10 09:29:47 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 10 09:29:47 localhost kernel: Booting paravirtualized kernel on KVM
Dec 10 09:29:47 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 10 09:29:47 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 10 09:29:47 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec 10 09:29:47 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Dec 10 09:29:47 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 10 09:29:47 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 10 09:29:47 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64 root=UUID=cbdedf45-ed1d-4952-82a8-33a12c0ba266 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 10 09:29:47 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64", will be passed to user space.
Dec 10 09:29:47 localhost kernel: random: crng init done
Dec 10 09:29:47 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 10 09:29:47 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 10 09:29:47 localhost kernel: Fallback order for Node 0: 0 
Dec 10 09:29:47 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 10 09:29:47 localhost kernel: Policy zone: Normal
Dec 10 09:29:47 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 10 09:29:47 localhost kernel: software IO TLB: area num 8.
Dec 10 09:29:47 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 10 09:29:47 localhost kernel: ftrace: allocating 49357 entries in 193 pages
Dec 10 09:29:47 localhost kernel: ftrace: allocated 193 pages with 3 groups
Dec 10 09:29:47 localhost kernel: Dynamic Preempt: voluntary
Dec 10 09:29:47 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 10 09:29:47 localhost kernel: rcu:         RCU event tracing is enabled.
Dec 10 09:29:47 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 10 09:29:47 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 10 09:29:47 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 10 09:29:47 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 10 09:29:47 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 10 09:29:47 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 10 09:29:47 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 10 09:29:47 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 10 09:29:47 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 10 09:29:47 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 10 09:29:47 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 10 09:29:47 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 10 09:29:47 localhost kernel: Console: colour VGA+ 80x25
Dec 10 09:29:47 localhost kernel: printk: console [ttyS0] enabled
Dec 10 09:29:47 localhost kernel: ACPI: Core revision 20230331
Dec 10 09:29:47 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 10 09:29:47 localhost kernel: x2apic enabled
Dec 10 09:29:47 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Dec 10 09:29:47 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 10 09:29:47 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec 10 09:29:47 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 10 09:29:47 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 10 09:29:47 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 10 09:29:47 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 10 09:29:47 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 10 09:29:47 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 10 09:29:47 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 10 09:29:47 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 10 09:29:47 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 10 09:29:47 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 10 09:29:47 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 10 09:29:47 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 10 09:29:47 localhost kernel: x86/bugs: return thunk changed
Dec 10 09:29:47 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 10 09:29:47 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 10 09:29:47 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 10 09:29:47 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 10 09:29:47 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 10 09:29:47 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 10 09:29:47 localhost kernel: Freeing SMP alternatives memory: 40K
Dec 10 09:29:47 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 10 09:29:47 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 10 09:29:47 localhost kernel: landlock: Up and running.
Dec 10 09:29:47 localhost kernel: Yama: becoming mindful.
Dec 10 09:29:47 localhost kernel: SELinux:  Initializing.
Dec 10 09:29:47 localhost kernel: LSM support for eBPF active
Dec 10 09:29:47 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 10 09:29:47 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 10 09:29:47 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 10 09:29:47 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 10 09:29:47 localhost kernel: ... version:                0
Dec 10 09:29:47 localhost kernel: ... bit width:              48
Dec 10 09:29:47 localhost kernel: ... generic registers:      6
Dec 10 09:29:47 localhost kernel: ... value mask:             0000ffffffffffff
Dec 10 09:29:47 localhost kernel: ... max period:             00007fffffffffff
Dec 10 09:29:47 localhost kernel: ... fixed-purpose events:   0
Dec 10 09:29:47 localhost kernel: ... event mask:             000000000000003f
Dec 10 09:29:47 localhost kernel: signal: max sigframe size: 1776
Dec 10 09:29:47 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 10 09:29:47 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 10 09:29:47 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 10 09:29:47 localhost kernel: smpboot: x86: Booting SMP configuration:
Dec 10 09:29:47 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 10 09:29:47 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 10 09:29:47 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec 10 09:29:47 localhost kernel: node 0 deferred pages initialised in 8ms
Dec 10 09:29:47 localhost kernel: Memory: 7763984K/8388068K available (16384K kernel code, 5795K rwdata, 13916K rodata, 4192K init, 7164K bss, 618228K reserved, 0K cma-reserved)
Dec 10 09:29:47 localhost kernel: devtmpfs: initialized
Dec 10 09:29:47 localhost kernel: x86/mm: Memory block size: 128MB
Dec 10 09:29:47 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 10 09:29:47 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec 10 09:29:47 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 10 09:29:47 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 10 09:29:47 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 10 09:29:47 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 10 09:29:47 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 10 09:29:47 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 10 09:29:47 localhost kernel: audit: type=2000 audit(1765358985.042:1): state=initialized audit_enabled=0 res=1
Dec 10 09:29:47 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 10 09:29:47 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 10 09:29:47 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 10 09:29:47 localhost kernel: cpuidle: using governor menu
Dec 10 09:29:47 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 10 09:29:47 localhost kernel: PCI: Using configuration type 1 for base access
Dec 10 09:29:47 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 10 09:29:47 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 10 09:29:47 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 10 09:29:47 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 10 09:29:47 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 10 09:29:47 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 10 09:29:47 localhost kernel: Demotion targets for Node 0: null
Dec 10 09:29:47 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 10 09:29:47 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 10 09:29:47 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 10 09:29:47 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 10 09:29:47 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 10 09:29:47 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 10 09:29:47 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 10 09:29:47 localhost kernel: ACPI: Interpreter enabled
Dec 10 09:29:47 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 10 09:29:47 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 10 09:29:47 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 10 09:29:47 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 10 09:29:47 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 10 09:29:47 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 10 09:29:47 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [3] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [4] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [5] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [6] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [7] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [8] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [9] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [10] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [11] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [12] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [13] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [14] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [15] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [16] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [17] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [18] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [19] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [20] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [21] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [22] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [23] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [24] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [25] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [26] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [27] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [28] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [29] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [30] registered
Dec 10 09:29:47 localhost kernel: acpiphp: Slot [31] registered
Dec 10 09:29:47 localhost kernel: PCI host bridge to bus 0000:00
Dec 10 09:29:47 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 10 09:29:47 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 10 09:29:47 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 10 09:29:47 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 10 09:29:47 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec 10 09:29:47 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 10 09:29:47 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 10 09:29:47 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 10 09:29:47 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec 10 09:29:47 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 10 09:29:47 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec 10 09:29:47 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec 10 09:29:47 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 10 09:29:47 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 10 09:29:47 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec 10 09:29:47 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec 10 09:29:47 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 10 09:29:47 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec 10 09:29:47 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 10 09:29:47 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec 10 09:29:47 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec 10 09:29:47 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 10 09:29:47 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec 10 09:29:47 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec 10 09:29:47 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 10 09:29:47 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec 10 09:29:47 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec 10 09:29:47 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 10 09:29:47 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 10 09:29:47 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 10 09:29:47 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 10 09:29:47 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 10 09:29:47 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 10 09:29:47 localhost kernel: iommu: Default domain type: Translated
Dec 10 09:29:47 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 10 09:29:47 localhost kernel: SCSI subsystem initialized
Dec 10 09:29:47 localhost kernel: ACPI: bus type USB registered
Dec 10 09:29:47 localhost kernel: usbcore: registered new interface driver usbfs
Dec 10 09:29:47 localhost kernel: usbcore: registered new interface driver hub
Dec 10 09:29:47 localhost kernel: usbcore: registered new device driver usb
Dec 10 09:29:47 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 10 09:29:47 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 10 09:29:47 localhost kernel: PTP clock support registered
Dec 10 09:29:47 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 10 09:29:47 localhost kernel: NetLabel: Initializing
Dec 10 09:29:47 localhost kernel: NetLabel:  domain hash size = 128
Dec 10 09:29:47 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 10 09:29:47 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 10 09:29:47 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 10 09:29:47 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 10 09:29:47 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 10 09:29:47 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 10 09:29:47 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 10 09:29:47 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 10 09:29:47 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 10 09:29:47 localhost kernel: vgaarb: loaded
Dec 10 09:29:47 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 10 09:29:47 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 10 09:29:47 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 10 09:29:47 localhost kernel: pnp: PnP ACPI init
Dec 10 09:29:47 localhost kernel: pnp 00:03: [dma 2]
Dec 10 09:29:47 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 10 09:29:47 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 10 09:29:47 localhost kernel: NET: Registered PF_INET protocol family
Dec 10 09:29:47 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 10 09:29:47 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 10 09:29:47 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 10 09:29:47 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 10 09:29:47 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 10 09:29:47 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 10 09:29:47 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 10 09:29:47 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 10 09:29:47 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 10 09:29:47 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 10 09:29:47 localhost kernel: NET: Registered PF_XDP protocol family
Dec 10 09:29:47 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 10 09:29:47 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 10 09:29:47 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 10 09:29:47 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 10 09:29:47 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 10 09:29:47 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 10 09:29:47 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 10 09:29:47 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 73356 usecs
Dec 10 09:29:47 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 10 09:29:47 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 10 09:29:47 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 10 09:29:47 localhost kernel: ACPI: bus type thunderbolt registered
Dec 10 09:29:47 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 10 09:29:47 localhost kernel: Initialise system trusted keyrings
Dec 10 09:29:47 localhost kernel: Key type blacklist registered
Dec 10 09:29:47 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 10 09:29:47 localhost kernel: zbud: loaded
Dec 10 09:29:47 localhost kernel: integrity: Platform Keyring initialized
Dec 10 09:29:47 localhost kernel: integrity: Machine keyring initialized
Dec 10 09:29:47 localhost kernel: Freeing initrd memory: 87820K
Dec 10 09:29:47 localhost kernel: NET: Registered PF_ALG protocol family
Dec 10 09:29:47 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 10 09:29:47 localhost kernel: Key type asymmetric registered
Dec 10 09:29:47 localhost kernel: Asymmetric key parser 'x509' registered
Dec 10 09:29:47 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 10 09:29:47 localhost kernel: io scheduler mq-deadline registered
Dec 10 09:29:47 localhost kernel: io scheduler kyber registered
Dec 10 09:29:47 localhost kernel: io scheduler bfq registered
Dec 10 09:29:47 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 10 09:29:47 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 10 09:29:47 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 10 09:29:47 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 10 09:29:47 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 10 09:29:47 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 10 09:29:47 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 10 09:29:47 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 10 09:29:47 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 10 09:29:47 localhost kernel: Non-volatile memory driver v1.3
Dec 10 09:29:47 localhost kernel: rdac: device handler registered
Dec 10 09:29:47 localhost kernel: hp_sw: device handler registered
Dec 10 09:29:47 localhost kernel: emc: device handler registered
Dec 10 09:29:47 localhost kernel: alua: device handler registered
Dec 10 09:29:47 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 10 09:29:47 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 10 09:29:47 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 10 09:29:47 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 10 09:29:47 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 10 09:29:47 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 10 09:29:47 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 10 09:29:47 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-648.el9.x86_64 uhci_hcd
Dec 10 09:29:47 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 10 09:29:47 localhost kernel: hub 1-0:1.0: USB hub found
Dec 10 09:29:47 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 10 09:29:47 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 10 09:29:47 localhost kernel: usbserial: USB Serial support registered for generic
Dec 10 09:29:47 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 10 09:29:47 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 10 09:29:47 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 10 09:29:47 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 10 09:29:47 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 10 09:29:47 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 10 09:29:47 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-10T09:29:46 UTC (1765358986)
Dec 10 09:29:47 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 10 09:29:47 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 10 09:29:47 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 10 09:29:47 localhost kernel: usbcore: registered new interface driver usbhid
Dec 10 09:29:47 localhost kernel: usbhid: USB HID core driver
Dec 10 09:29:47 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 10 09:29:47 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 10 09:29:47 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 10 09:29:47 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 10 09:29:47 localhost kernel: Initializing XFRM netlink socket
Dec 10 09:29:47 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 10 09:29:47 localhost kernel: Segment Routing with IPv6
Dec 10 09:29:47 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 10 09:29:47 localhost kernel: mpls_gso: MPLS GSO support
Dec 10 09:29:47 localhost kernel: IPI shorthand broadcast: enabled
Dec 10 09:29:47 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 10 09:29:47 localhost kernel: AES CTR mode by8 optimization enabled
Dec 10 09:29:47 localhost kernel: sched_clock: Marking stable (1811009935, 149371396)->(2039923657, -79542326)
Dec 10 09:29:47 localhost kernel: registered taskstats version 1
Dec 10 09:29:47 localhost kernel: Loading compiled-in X.509 certificates
Dec 10 09:29:47 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bcc7fcdcfd9be61e8634554e9f7a1c01f32489d8'
Dec 10 09:29:47 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 10 09:29:47 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 10 09:29:47 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 10 09:29:47 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 10 09:29:47 localhost kernel: Demotion targets for Node 0: null
Dec 10 09:29:47 localhost kernel: page_owner is disabled
Dec 10 09:29:47 localhost kernel: Key type .fscrypt registered
Dec 10 09:29:47 localhost kernel: Key type fscrypt-provisioning registered
Dec 10 09:29:47 localhost kernel: Key type big_key registered
Dec 10 09:29:47 localhost kernel: Key type encrypted registered
Dec 10 09:29:47 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 10 09:29:47 localhost kernel: Loading compiled-in module X.509 certificates
Dec 10 09:29:47 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bcc7fcdcfd9be61e8634554e9f7a1c01f32489d8'
Dec 10 09:29:47 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 10 09:29:47 localhost kernel: ima: No architecture policies found
Dec 10 09:29:47 localhost kernel: evm: Initialising EVM extended attributes:
Dec 10 09:29:47 localhost kernel: evm: security.selinux
Dec 10 09:29:47 localhost kernel: evm: security.SMACK64 (disabled)
Dec 10 09:29:47 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 10 09:29:47 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 10 09:29:47 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 10 09:29:47 localhost kernel: evm: security.apparmor (disabled)
Dec 10 09:29:47 localhost kernel: evm: security.ima
Dec 10 09:29:47 localhost kernel: evm: security.capability
Dec 10 09:29:47 localhost kernel: evm: HMAC attrs: 0x1
Dec 10 09:29:47 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 10 09:29:47 localhost kernel: Running certificate verification RSA selftest
Dec 10 09:29:47 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 10 09:29:47 localhost kernel: Running certificate verification ECDSA selftest
Dec 10 09:29:47 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 10 09:29:47 localhost kernel: clk: Disabling unused clocks
Dec 10 09:29:47 localhost kernel: Freeing unused decrypted memory: 2028K
Dec 10 09:29:47 localhost kernel: Freeing unused kernel image (initmem) memory: 4192K
Dec 10 09:29:47 localhost kernel: Write protecting the kernel read-only data: 30720k
Dec 10 09:29:47 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Dec 10 09:29:47 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 10 09:29:47 localhost kernel: Run /init as init process
Dec 10 09:29:47 localhost kernel:   with arguments:
Dec 10 09:29:47 localhost kernel:     /init
Dec 10 09:29:47 localhost kernel:   with environment:
Dec 10 09:29:47 localhost kernel:     HOME=/
Dec 10 09:29:47 localhost kernel:     TERM=linux
Dec 10 09:29:47 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64
Dec 10 09:29:47 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 10 09:29:47 localhost systemd[1]: Detected virtualization kvm.
Dec 10 09:29:47 localhost systemd[1]: Detected architecture x86-64.
Dec 10 09:29:47 localhost systemd[1]: Running in initrd.
Dec 10 09:29:47 localhost systemd[1]: No hostname configured, using default hostname.
Dec 10 09:29:47 localhost systemd[1]: Hostname set to <localhost>.
Dec 10 09:29:47 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 10 09:29:47 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 10 09:29:47 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 10 09:29:47 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 10 09:29:47 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 10 09:29:47 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 10 09:29:47 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 10 09:29:47 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 10 09:29:47 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 10 09:29:47 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 10 09:29:47 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 10 09:29:47 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 10 09:29:47 localhost systemd[1]: Reached target Local File Systems.
Dec 10 09:29:47 localhost systemd[1]: Reached target Path Units.
Dec 10 09:29:47 localhost systemd[1]: Reached target Slice Units.
Dec 10 09:29:47 localhost systemd[1]: Reached target Swaps.
Dec 10 09:29:47 localhost systemd[1]: Reached target Timer Units.
Dec 10 09:29:47 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 10 09:29:47 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 10 09:29:47 localhost systemd[1]: Listening on Journal Socket.
Dec 10 09:29:47 localhost systemd[1]: Listening on udev Control Socket.
Dec 10 09:29:47 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 10 09:29:47 localhost systemd[1]: Reached target Socket Units.
Dec 10 09:29:47 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 10 09:29:47 localhost systemd[1]: Starting Journal Service...
Dec 10 09:29:47 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 10 09:29:47 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 10 09:29:47 localhost systemd[1]: Starting Create System Users...
Dec 10 09:29:47 localhost systemd[1]: Starting Setup Virtual Console...
Dec 10 09:29:47 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 10 09:29:47 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 10 09:29:47 localhost systemd-journald[303]: Journal started
Dec 10 09:29:47 localhost systemd-journald[303]: Runtime Journal (/run/log/journal/4f9a932ed23a4638b69b16fdca20f7f4) is 8.0M, max 153.6M, 145.6M free.
Dec 10 09:29:47 localhost systemd[1]: Finished Create System Users.
Dec 10 09:29:47 localhost systemd-sysusers[307]: Creating group 'users' with GID 100.
Dec 10 09:29:47 localhost systemd-sysusers[307]: Creating group 'dbus' with GID 81.
Dec 10 09:29:47 localhost systemd-sysusers[307]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 10 09:29:47 localhost systemd[1]: Started Journal Service.
Dec 10 09:29:47 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 10 09:29:47 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 10 09:29:47 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 10 09:29:47 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 10 09:29:47 localhost systemd[1]: Finished Setup Virtual Console.
Dec 10 09:29:47 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 10 09:29:47 localhost systemd[1]: Starting dracut cmdline hook...
Dec 10 09:29:47 localhost dracut-cmdline[323]: dracut-9 dracut-057-102.git20250818.el9
Dec 10 09:29:47 localhost dracut-cmdline[323]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64 root=UUID=cbdedf45-ed1d-4952-82a8-33a12c0ba266 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 10 09:29:47 localhost systemd[1]: Finished dracut cmdline hook.
Dec 10 09:29:47 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 10 09:29:47 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 10 09:29:47 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 10 09:29:47 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 10 09:29:47 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 10 09:29:47 localhost kernel: RPC: Registered udp transport module.
Dec 10 09:29:47 localhost kernel: RPC: Registered tcp transport module.
Dec 10 09:29:47 localhost kernel: RPC: Registered tcp-with-tls transport module.
Dec 10 09:29:47 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 10 09:29:47 localhost rpc.statd[438]: Version 2.5.4 starting
Dec 10 09:29:47 localhost rpc.statd[438]: Initializing NSM state
Dec 10 09:29:47 localhost rpc.idmapd[443]: Setting log level to 0
Dec 10 09:29:47 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 10 09:29:47 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 10 09:29:47 localhost systemd-udevd[456]: Using default interface naming scheme 'rhel-9.0'.
Dec 10 09:29:47 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 10 09:29:47 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 10 09:29:47 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 10 09:29:47 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 10 09:29:48 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 10 09:29:48 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 10 09:29:48 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 10 09:29:48 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 10 09:29:48 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 10 09:29:48 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 10 09:29:48 localhost systemd[1]: Reached target Network.
Dec 10 09:29:48 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 10 09:29:48 localhost systemd[1]: Starting dracut initqueue hook...
Dec 10 09:29:48 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 10 09:29:48 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 10 09:29:48 localhost systemd[1]: Reached target System Initialization.
Dec 10 09:29:48 localhost systemd[1]: Reached target Basic System.
Dec 10 09:29:48 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec 10 09:29:48 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 10 09:29:48 localhost kernel:  vda: vda1
Dec 10 09:29:48 localhost kernel: libata version 3.00 loaded.
Dec 10 09:29:48 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 10 09:29:48 localhost systemd[1]: Found device /dev/disk/by-uuid/cbdedf45-ed1d-4952-82a8-33a12c0ba266.
Dec 10 09:29:48 localhost kernel: scsi host0: ata_piix
Dec 10 09:29:48 localhost kernel: scsi host1: ata_piix
Dec 10 09:29:48 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec 10 09:29:48 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec 10 09:29:48 localhost systemd[1]: Reached target Initrd Root Device.
Dec 10 09:29:48 localhost kernel: ata1: found unknown device (class 0)
Dec 10 09:29:48 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 10 09:29:48 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 10 09:29:48 localhost systemd-udevd[479]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 09:29:48 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 10 09:29:48 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 10 09:29:48 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 10 09:29:48 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 10 09:29:48 localhost systemd[1]: Finished dracut initqueue hook.
Dec 10 09:29:48 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 10 09:29:48 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 10 09:29:48 localhost systemd[1]: Reached target Remote File Systems.
Dec 10 09:29:48 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 10 09:29:48 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 10 09:29:48 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/cbdedf45-ed1d-4952-82a8-33a12c0ba266...
Dec 10 09:29:48 localhost systemd-fsck[552]: /usr/sbin/fsck.xfs: XFS file system.
Dec 10 09:29:48 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/cbdedf45-ed1d-4952-82a8-33a12c0ba266.
Dec 10 09:29:48 localhost systemd[1]: Mounting /sysroot...
Dec 10 09:29:48 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 10 09:29:48 localhost kernel: XFS (vda1): Mounting V5 Filesystem cbdedf45-ed1d-4952-82a8-33a12c0ba266
Dec 10 09:29:49 localhost kernel: XFS (vda1): Ending clean mount
Dec 10 09:29:49 localhost systemd[1]: Mounted /sysroot.
Dec 10 09:29:49 localhost systemd[1]: Reached target Initrd Root File System.
Dec 10 09:29:49 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 10 09:29:49 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 10 09:29:49 localhost systemd[1]: Reached target Initrd File Systems.
Dec 10 09:29:49 localhost systemd[1]: Reached target Initrd Default Target.
Dec 10 09:29:49 localhost systemd[1]: Starting dracut mount hook...
Dec 10 09:29:49 localhost systemd[1]: Finished dracut mount hook.
Dec 10 09:29:49 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 10 09:29:49 localhost rpc.idmapd[443]: exiting on signal 15
Dec 10 09:29:49 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 10 09:29:49 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 10 09:29:49 localhost systemd[1]: Stopped target Network.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Timer Units.
Dec 10 09:29:49 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 10 09:29:49 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Basic System.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Path Units.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Remote File Systems.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Slice Units.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Socket Units.
Dec 10 09:29:49 localhost systemd[1]: Stopped target System Initialization.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Local File Systems.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Swaps.
Dec 10 09:29:49 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped dracut mount hook.
Dec 10 09:29:49 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 10 09:29:49 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 10 09:29:49 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 10 09:29:49 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 10 09:29:49 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 10 09:29:49 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 10 09:29:49 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 10 09:29:49 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 10 09:29:49 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 10 09:29:49 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 10 09:29:49 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 10 09:29:49 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 10 09:29:49 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Closed udev Control Socket.
Dec 10 09:29:49 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Closed udev Kernel Socket.
Dec 10 09:29:49 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 10 09:29:49 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 10 09:29:49 localhost systemd[1]: Starting Cleanup udev Database...
Dec 10 09:29:49 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 10 09:29:49 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 10 09:29:49 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Stopped Create System Users.
Dec 10 09:29:49 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 10 09:29:49 localhost systemd[1]: Finished Cleanup udev Database.
Dec 10 09:29:49 localhost systemd[1]: Reached target Switch Root.
Dec 10 09:29:49 localhost systemd[1]: Starting Switch Root...
Dec 10 09:29:49 localhost systemd[1]: Switching root.
Dec 10 09:29:49 localhost systemd-journald[303]: Journal stopped
Dec 10 09:29:50 localhost systemd-journald[303]: Received SIGTERM from PID 1 (systemd).
Dec 10 09:29:50 localhost kernel: audit: type=1404 audit(1765358989.414:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 10 09:29:50 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 10 09:29:50 localhost kernel: SELinux:  policy capability open_perms=1
Dec 10 09:29:50 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 10 09:29:50 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 10 09:29:50 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 10 09:29:50 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 10 09:29:50 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 10 09:29:50 localhost kernel: audit: type=1403 audit(1765358989.537:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 10 09:29:50 localhost systemd[1]: Successfully loaded SELinux policy in 124.990ms.
Dec 10 09:29:50 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 34.594ms.
Dec 10 09:29:50 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 10 09:29:50 localhost systemd[1]: Detected virtualization kvm.
Dec 10 09:29:50 localhost systemd[1]: Detected architecture x86-64.
Dec 10 09:29:50 localhost systemd-rc-local-generator[635]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 09:29:50 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 10 09:29:50 localhost systemd[1]: Stopped Switch Root.
Dec 10 09:29:50 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 10 09:29:50 localhost systemd[1]: Created slice Slice /system/getty.
Dec 10 09:29:50 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 10 09:29:50 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 10 09:29:50 localhost systemd[1]: Created slice User and Session Slice.
Dec 10 09:29:50 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 10 09:29:50 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 10 09:29:50 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 10 09:29:50 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 10 09:29:50 localhost systemd[1]: Stopped target Switch Root.
Dec 10 09:29:50 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 10 09:29:50 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 10 09:29:50 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 10 09:29:50 localhost systemd[1]: Reached target Path Units.
Dec 10 09:29:50 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 10 09:29:50 localhost systemd[1]: Reached target Slice Units.
Dec 10 09:29:50 localhost systemd[1]: Reached target Swaps.
Dec 10 09:29:50 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 10 09:29:50 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 10 09:29:50 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 10 09:29:50 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 10 09:29:50 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 10 09:29:50 localhost systemd[1]: Listening on udev Control Socket.
Dec 10 09:29:50 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 10 09:29:50 localhost systemd[1]: Mounting Huge Pages File System...
Dec 10 09:29:50 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 10 09:29:50 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 10 09:29:50 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 10 09:29:50 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 10 09:29:50 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 10 09:29:50 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 10 09:29:50 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 10 09:29:50 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Dec 10 09:29:50 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 10 09:29:50 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 10 09:29:50 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 10 09:29:50 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 10 09:29:50 localhost systemd[1]: Stopped Journal Service.
Dec 10 09:29:50 localhost systemd[1]: Starting Journal Service...
Dec 10 09:29:50 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 10 09:29:50 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 10 09:29:50 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 10 09:29:50 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 10 09:29:50 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 10 09:29:50 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 10 09:29:50 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 10 09:29:50 localhost kernel: fuse: init (API version 7.37)
Dec 10 09:29:50 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 10 09:29:50 localhost systemd[1]: Mounted Huge Pages File System.
Dec 10 09:29:50 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 10 09:29:50 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 10 09:29:50 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 10 09:29:50 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 10 09:29:50 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 10 09:29:50 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 10 09:29:50 localhost systemd-journald[676]: Journal started
Dec 10 09:29:50 localhost systemd-journald[676]: Runtime Journal (/run/log/journal/64f1d6692049d8be5e8b216cc203502c) is 8.0M, max 153.6M, 145.6M free.
Dec 10 09:29:49 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 10 09:29:49 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 10 09:29:50 localhost systemd[1]: Started Journal Service.
Dec 10 09:29:50 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 10 09:29:50 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 10 09:29:50 localhost kernel: ACPI: bus type drm_connector registered
Dec 10 09:29:50 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 10 09:29:50 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 10 09:29:50 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 10 09:29:50 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 10 09:29:50 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 10 09:29:50 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 10 09:29:50 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 10 09:29:50 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 10 09:29:50 localhost systemd[1]: Mounting FUSE Control File System...
Dec 10 09:29:50 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 10 09:29:50 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 10 09:29:50 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 10 09:29:50 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 10 09:29:50 localhost systemd[1]: Starting Load/Save OS Random Seed...
Dec 10 09:29:50 localhost systemd[1]: Starting Create System Users...
Dec 10 09:29:50 localhost systemd[1]: Mounted FUSE Control File System.
Dec 10 09:29:50 localhost systemd-journald[676]: Runtime Journal (/run/log/journal/64f1d6692049d8be5e8b216cc203502c) is 8.0M, max 153.6M, 145.6M free.
Dec 10 09:29:50 localhost systemd-journald[676]: Received client request to flush runtime journal.
Dec 10 09:29:50 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 10 09:29:50 localhost systemd[1]: Finished Load/Save OS Random Seed.
Dec 10 09:29:50 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 10 09:29:50 localhost systemd[1]: Finished Create System Users.
Dec 10 09:29:50 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 10 09:29:50 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 10 09:29:50 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 10 09:29:50 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 10 09:29:50 localhost systemd[1]: Reached target Local File Systems.
Dec 10 09:29:50 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 10 09:29:50 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 10 09:29:50 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 10 09:29:50 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 10 09:29:50 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 10 09:29:50 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 10 09:29:50 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 10 09:29:50 localhost bootctl[695]: Couldn't find EFI system partition, skipping.
Dec 10 09:29:50 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 10 09:29:50 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 10 09:29:50 localhost systemd[1]: Starting Security Auditing Service...
Dec 10 09:29:50 localhost systemd[1]: Starting RPC Bind...
Dec 10 09:29:50 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 10 09:29:50 localhost auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 10 09:29:50 localhost auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 10 09:29:50 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 10 09:29:50 localhost systemd[1]: Started RPC Bind.
Dec 10 09:29:50 localhost augenrules[706]: /sbin/augenrules: No change
Dec 10 09:29:50 localhost augenrules[722]: No rules
Dec 10 09:29:50 localhost augenrules[722]: enabled 1
Dec 10 09:29:50 localhost augenrules[722]: failure 1
Dec 10 09:29:50 localhost augenrules[722]: pid 701
Dec 10 09:29:50 localhost augenrules[722]: rate_limit 0
Dec 10 09:29:50 localhost augenrules[722]: backlog_limit 8192
Dec 10 09:29:50 localhost augenrules[722]: lost 0
Dec 10 09:29:50 localhost augenrules[722]: backlog 3
Dec 10 09:29:50 localhost augenrules[722]: backlog_wait_time 60000
Dec 10 09:29:50 localhost augenrules[722]: backlog_wait_time_actual 0
Dec 10 09:29:50 localhost augenrules[722]: enabled 1
Dec 10 09:29:50 localhost augenrules[722]: failure 1
Dec 10 09:29:50 localhost augenrules[722]: pid 701
Dec 10 09:29:50 localhost augenrules[722]: rate_limit 0
Dec 10 09:29:50 localhost augenrules[722]: backlog_limit 8192
Dec 10 09:29:50 localhost augenrules[722]: lost 0
Dec 10 09:29:50 localhost augenrules[722]: backlog 4
Dec 10 09:29:50 localhost augenrules[722]: backlog_wait_time 60000
Dec 10 09:29:50 localhost augenrules[722]: backlog_wait_time_actual 0
Dec 10 09:29:50 localhost augenrules[722]: enabled 1
Dec 10 09:29:50 localhost augenrules[722]: failure 1
Dec 10 09:29:50 localhost augenrules[722]: pid 701
Dec 10 09:29:50 localhost augenrules[722]: rate_limit 0
Dec 10 09:29:50 localhost augenrules[722]: backlog_limit 8192
Dec 10 09:29:50 localhost augenrules[722]: lost 0
Dec 10 09:29:50 localhost augenrules[722]: backlog 4
Dec 10 09:29:50 localhost augenrules[722]: backlog_wait_time 60000
Dec 10 09:29:50 localhost augenrules[722]: backlog_wait_time_actual 0
Dec 10 09:29:50 localhost systemd[1]: Started Security Auditing Service.
Dec 10 09:29:50 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 10 09:29:50 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 10 09:29:50 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 10 09:29:50 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 10 09:29:50 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 10 09:29:50 localhost systemd[1]: Starting Update is Completed...
Dec 10 09:29:50 localhost systemd[1]: Finished Update is Completed.
Dec 10 09:29:50 localhost systemd-udevd[730]: Using default interface naming scheme 'rhel-9.0'.
Dec 10 09:29:50 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 10 09:29:50 localhost systemd[1]: Reached target System Initialization.
Dec 10 09:29:50 localhost systemd[1]: Started dnf makecache --timer.
Dec 10 09:29:50 localhost systemd[1]: Started Daily rotation of log files.
Dec 10 09:29:50 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 10 09:29:50 localhost systemd[1]: Reached target Timer Units.
Dec 10 09:29:50 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 10 09:29:50 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 10 09:29:50 localhost systemd[1]: Reached target Socket Units.
Dec 10 09:29:50 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 10 09:29:50 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 10 09:29:50 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 10 09:29:50 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 10 09:29:50 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 10 09:29:50 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 10 09:29:50 localhost systemd-udevd[747]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 09:29:50 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 10 09:29:50 localhost systemd[1]: Reached target Basic System.
Dec 10 09:29:50 localhost dbus-broker-lau[745]: Ready
Dec 10 09:29:50 localhost systemd[1]: Starting NTP client/server...
Dec 10 09:29:50 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 10 09:29:50 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 10 09:29:50 localhost systemd[1]: Starting IPv4 firewall with iptables...
Dec 10 09:29:50 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 10 09:29:50 localhost systemd[1]: Started irqbalance daemon.
Dec 10 09:29:50 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 10 09:29:50 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 10 09:29:50 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 10 09:29:50 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 10 09:29:50 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 10 09:29:50 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 10 09:29:50 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 10 09:29:50 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 10 09:29:50 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 10 09:29:50 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 10 09:29:50 localhost systemd[1]: Starting User Login Management...
Dec 10 09:29:50 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 10 09:29:50 localhost chronyd[800]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 10 09:29:50 localhost chronyd[800]: Loaded 0 symmetric keys
Dec 10 09:29:50 localhost chronyd[800]: Using right/UTC timezone to obtain leap second data
Dec 10 09:29:50 localhost chronyd[800]: Loaded seccomp filter (level 2)
Dec 10 09:29:50 localhost systemd[1]: Started NTP client/server.
Dec 10 09:29:50 localhost systemd-logind[787]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 10 09:29:50 localhost systemd-logind[787]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 10 09:29:50 localhost systemd-logind[787]: New seat seat0.
Dec 10 09:29:50 localhost systemd[1]: Started User Login Management.
Dec 10 09:29:50 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 10 09:29:50 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 10 09:29:51 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 10 09:29:51 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 10 09:29:51 localhost kernel: kvm_amd: TSC scaling supported
Dec 10 09:29:51 localhost kernel: kvm_amd: Nested Virtualization enabled
Dec 10 09:29:51 localhost kernel: kvm_amd: Nested Paging enabled
Dec 10 09:29:51 localhost kernel: kvm_amd: LBR virtualization supported
Dec 10 09:29:51 localhost kernel: Console: switching to colour dummy device 80x25
Dec 10 09:29:51 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 10 09:29:51 localhost kernel: [drm] features: -context_init
Dec 10 09:29:51 localhost kernel: [drm] number of scanouts: 1
Dec 10 09:29:51 localhost kernel: [drm] number of cap sets: 0
Dec 10 09:29:51 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec 10 09:29:51 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 10 09:29:51 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 10 09:29:51 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 10 09:29:51 localhost iptables.init[779]: iptables: Applying firewall rules: [  OK  ]
Dec 10 09:29:51 localhost systemd[1]: Finished IPv4 firewall with iptables.
Dec 10 09:29:51 localhost cloud-init[841]: Cloud-init v. 24.4-7.el9 running 'init-local' at Wed, 10 Dec 2025 09:29:51 +0000. Up 6.58 seconds.
Dec 10 09:29:51 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 10 09:29:51 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 10 09:29:51 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpc73vr7zf.mount: Deactivated successfully.
Dec 10 09:29:51 localhost systemd[1]: Starting Hostname Service...
Dec 10 09:29:51 localhost systemd[1]: Started Hostname Service.
Dec 10 09:29:51 np0005553242.novalocal systemd-hostnamed[855]: Hostname set to <np0005553242.novalocal> (static)
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Reached target Preparation for Network.
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Starting Network Manager...
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8564] NetworkManager (version 1.54.2-1.el9) is starting... (boot:1f343dd7-be59-44c1-890a-3a416daf01a6)
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8570] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8696] manager[0x55ea816e2000]: monitoring kernel firmware directory '/lib/firmware'.
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8732] hostname: hostname: using hostnamed
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8732] hostname: static hostname changed from (none) to "np0005553242.novalocal"
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8735] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8856] manager[0x55ea816e2000]: rfkill: Wi-Fi hardware radio set enabled
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8856] manager[0x55ea816e2000]: rfkill: WWAN hardware radio set enabled
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8903] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8904] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8904] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8905] manager: Networking is enabled by state file
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8907] settings: Loaded settings plugin: keyfile (internal)
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8922] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8951] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8965] dhcp: init: Using DHCP client 'internal'
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8969] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8983] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8991] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.8999] device (lo): Activation: starting connection 'lo' (0756ffbc-1ad3-4f52-9877-3151352e5ed6)
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9009] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9011] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9039] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9043] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9045] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9047] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9049] device (eth0): carrier: link connected
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9052] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9057] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9063] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9068] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9068] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9070] manager: NetworkManager state is now CONNECTING
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9071] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9078] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9083] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Started Network Manager.
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Reached target Network.
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Reached target NFS client services.
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9374] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9376] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 10 09:29:51 np0005553242.novalocal NetworkManager[859]: <info>  [1765358991.9382] device (lo): Activation: successful, device activated.
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: Reached target Remote File Systems.
Dec 10 09:29:51 np0005553242.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 10 09:29:53 np0005553242.novalocal NetworkManager[859]: <info>  [1765358993.8943] dhcp4 (eth0): state changed new lease, address=38.102.83.219
Dec 10 09:29:53 np0005553242.novalocal NetworkManager[859]: <info>  [1765358993.8959] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 10 09:29:53 np0005553242.novalocal NetworkManager[859]: <info>  [1765358993.8984] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 09:29:53 np0005553242.novalocal NetworkManager[859]: <info>  [1765358993.9015] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 09:29:53 np0005553242.novalocal NetworkManager[859]: <info>  [1765358993.9016] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 09:29:53 np0005553242.novalocal NetworkManager[859]: <info>  [1765358993.9020] manager: NetworkManager state is now CONNECTED_SITE
Dec 10 09:29:53 np0005553242.novalocal NetworkManager[859]: <info>  [1765358993.9022] device (eth0): Activation: successful, device activated.
Dec 10 09:29:53 np0005553242.novalocal NetworkManager[859]: <info>  [1765358993.9029] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 10 09:29:53 np0005553242.novalocal NetworkManager[859]: <info>  [1765358993.9032] manager: startup complete
Dec 10 09:29:53 np0005553242.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 10 09:29:53 np0005553242.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Wed, 10 Dec 2025 09:29:54 +0000. Up 9.48 seconds.
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.219         | 255.255.255.0 | global | fa:16:3e:20:93:b1 |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fe20:93b1/64 |       .       |  link  | fa:16:3e:20:93:b1 |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Dec 10 09:29:54 np0005553242.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 10 09:29:55 np0005553242.novalocal useradd[989]: new group: name=cloud-user, GID=1001
Dec 10 09:29:55 np0005553242.novalocal useradd[989]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 10 09:29:55 np0005553242.novalocal useradd[989]: add 'cloud-user' to group 'adm'
Dec 10 09:29:55 np0005553242.novalocal useradd[989]: add 'cloud-user' to group 'systemd-journal'
Dec 10 09:29:55 np0005553242.novalocal useradd[989]: add 'cloud-user' to shadow group 'adm'
Dec 10 09:29:55 np0005553242.novalocal useradd[989]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: Generating public/private rsa key pair.
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: The key fingerprint is:
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: SHA256:V760mg0bvD+CRJnnrdNsKWJLn0nZuABS22HVJkE/6BI root@np0005553242.novalocal
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: The key's randomart image is:
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: +---[RSA 3072]----+
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |         .+o     |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |         ..oo    |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |      . Eo.o+    |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |     . +++.o .   |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |    . o.Soo.o    |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |     . ..+.=.o   |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |       .o.B++.   |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |       .++=%*    |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |       ..oX*o.   |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: +----[SHA256]-----+
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: Generating public/private ecdsa key pair.
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: The key fingerprint is:
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: SHA256:GNw76maaLBz9mUAEQ7fbFRMhqmwWLywIyjo3otWhVjY root@np0005553242.novalocal
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: The key's randomart image is:
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: +---[ECDSA 256]---+
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: | .+.. . =o       |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |   o.+ o o       |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |. ..o o o        |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |=o +.o + .       |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |+.BoE o S        |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |.+.*oo . .       |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |+.*..o.o         |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |.*oo o*          |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |.  .++.          |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: +----[SHA256]-----+
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: Generating public/private ed25519 key pair.
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: The key fingerprint is:
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: SHA256:sQcs7zgTYhG2hqeO9jmq+BmuJO/6fQGxiTnMnHi9954 root@np0005553242.novalocal
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: The key's randomart image is:
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: +--[ED25519 256]--+
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |    o            |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |   o.o .         |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: | =.==+. +        |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |. X+=. o +       |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: | ...oo. S .      |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: | o ...o+ .       |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |ooo  .+o.        |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |=o.=. .o..       |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: |*BOoo. .E        |
Dec 10 09:29:55 np0005553242.novalocal cloud-init[922]: +----[SHA256]-----+
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Reached target Network is Online.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Starting System Logging Service...
Dec 10 09:29:55 np0005553242.novalocal sm-notify[1005]: Version 2.5.4 starting
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Starting Permit User Sessions...
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 10 09:29:55 np0005553242.novalocal sshd[1007]: Server listening on 0.0.0.0 port 22.
Dec 10 09:29:55 np0005553242.novalocal sshd[1007]: Server listening on :: port 22.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Finished Permit User Sessions.
Dec 10 09:29:55 np0005553242.novalocal rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Dec 10 09:29:55 np0005553242.novalocal rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Started Command Scheduler.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Started Getty on tty1.
Dec 10 09:29:55 np0005553242.novalocal crond[1011]: (CRON) STARTUP (1.5.7)
Dec 10 09:29:55 np0005553242.novalocal crond[1011]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 10 09:29:55 np0005553242.novalocal crond[1011]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 70% if used.)
Dec 10 09:29:55 np0005553242.novalocal crond[1011]: (CRON) INFO (running with inotify support)
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Reached target Login Prompts.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Started System Logging Service.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Reached target Multi-User System.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 10 09:29:55 np0005553242.novalocal rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 10 09:29:55 np0005553242.novalocal kdumpctl[1016]: kdump: No kdump initial ramdisk found.
Dec 10 09:29:55 np0005553242.novalocal kdumpctl[1016]: kdump: Rebuilding /boot/initramfs-5.14.0-648.el9.x86_64kdump.img
Dec 10 09:29:55 np0005553242.novalocal cloud-init[1158]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Wed, 10 Dec 2025 09:29:55 +0000. Up 11.05 seconds.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Dec 10 09:29:55 np0005553242.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Dec 10 09:29:56 np0005553242.novalocal dracut[1266]: dracut-057-102.git20250818.el9
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/cbdedf45-ed1d-4952-82a8-33a12c0ba266 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-648.el9.x86_64kdump.img 5.14.0-648.el9.x86_64
Dec 10 09:29:56 np0005553242.novalocal cloud-init[1326]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Wed, 10 Dec 2025 09:29:56 +0000. Up 11.48 seconds.
Dec 10 09:29:56 np0005553242.novalocal cloud-init[1341]: #############################################################
Dec 10 09:29:56 np0005553242.novalocal cloud-init[1342]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 10 09:29:56 np0005553242.novalocal cloud-init[1347]: 256 SHA256:GNw76maaLBz9mUAEQ7fbFRMhqmwWLywIyjo3otWhVjY root@np0005553242.novalocal (ECDSA)
Dec 10 09:29:56 np0005553242.novalocal cloud-init[1349]: 256 SHA256:sQcs7zgTYhG2hqeO9jmq+BmuJO/6fQGxiTnMnHi9954 root@np0005553242.novalocal (ED25519)
Dec 10 09:29:56 np0005553242.novalocal cloud-init[1354]: 3072 SHA256:V760mg0bvD+CRJnnrdNsKWJLn0nZuABS22HVJkE/6BI root@np0005553242.novalocal (RSA)
Dec 10 09:29:56 np0005553242.novalocal cloud-init[1355]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 10 09:29:56 np0005553242.novalocal cloud-init[1356]: #############################################################
Dec 10 09:29:56 np0005553242.novalocal cloud-init[1326]: Cloud-init v. 24.4-7.el9 finished at Wed, 10 Dec 2025 09:29:56 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.63 seconds
Dec 10 09:29:56 np0005553242.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Dec 10 09:29:56 np0005553242.novalocal systemd[1]: Reached target Cloud-init target.
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 10 09:29:56 np0005553242.novalocal sshd-session[1552]: Connection reset by 38.102.83.114 port 56866 [preauth]
Dec 10 09:29:56 np0005553242.novalocal sshd-session[1568]: Unable to negotiate with 38.102.83.114 port 56882: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 10 09:29:56 np0005553242.novalocal sshd-session[1575]: Connection closed by 38.102.83.114 port 56886 [preauth]
Dec 10 09:29:56 np0005553242.novalocal sshd-session[1587]: Unable to negotiate with 38.102.83.114 port 56894: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 10 09:29:56 np0005553242.novalocal dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 10 09:29:56 np0005553242.novalocal sshd-session[1597]: Unable to negotiate with 38.102.83.114 port 56898: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: Module 'resume' will not be installed, because it's in the list to be omitted!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 10 09:29:57 np0005553242.novalocal sshd-session[1656]: Unable to negotiate with 38.102.83.114 port 56932: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Dec 10 09:29:57 np0005553242.novalocal sshd-session[1663]: Unable to negotiate with 38.102.83.114 port 56942: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 10 09:29:57 np0005553242.novalocal sshd-session[1613]: Connection closed by 38.102.83.114 port 56910 [preauth]
Dec 10 09:29:57 np0005553242.novalocal sshd-session[1625]: Connection closed by 38.102.83.114 port 56924 [preauth]
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: memstrack is not available
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: memstrack is not available
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: *** Including module: systemd ***
Dec 10 09:29:57 np0005553242.novalocal dracut[1268]: *** Including module: fips ***
Dec 10 09:29:58 np0005553242.novalocal dracut[1268]: *** Including module: systemd-initrd ***
Dec 10 09:29:58 np0005553242.novalocal dracut[1268]: *** Including module: i18n ***
Dec 10 09:29:58 np0005553242.novalocal dracut[1268]: *** Including module: drm ***
Dec 10 09:29:58 np0005553242.novalocal dracut[1268]: *** Including module: prefixdevname ***
Dec 10 09:29:58 np0005553242.novalocal dracut[1268]: *** Including module: kernel-modules ***
Dec 10 09:29:58 np0005553242.novalocal chronyd[800]: Selected source 54.39.17.239 (2.centos.pool.ntp.org)
Dec 10 09:29:58 np0005553242.novalocal chronyd[800]: System clock TAI offset set to 37 seconds
Dec 10 09:29:58 np0005553242.novalocal kernel: block vda: the capability attribute has been deprecated.
Dec 10 09:29:59 np0005553242.novalocal dracut[1268]: *** Including module: kernel-modules-extra ***
Dec 10 09:29:59 np0005553242.novalocal dracut[1268]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 10 09:29:59 np0005553242.novalocal dracut[1268]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 10 09:29:59 np0005553242.novalocal dracut[1268]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 10 09:29:59 np0005553242.novalocal dracut[1268]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 10 09:29:59 np0005553242.novalocal dracut[1268]: *** Including module: qemu ***
Dec 10 09:29:59 np0005553242.novalocal dracut[1268]: *** Including module: fstab-sys ***
Dec 10 09:29:59 np0005553242.novalocal dracut[1268]: *** Including module: rootfs-block ***
Dec 10 09:29:59 np0005553242.novalocal dracut[1268]: *** Including module: terminfo ***
Dec 10 09:29:59 np0005553242.novalocal dracut[1268]: *** Including module: udev-rules ***
Dec 10 09:29:59 np0005553242.novalocal dracut[1268]: Skipping udev rule: 91-permissions.rules
Dec 10 09:29:59 np0005553242.novalocal dracut[1268]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]: *** Including module: virtiofs ***
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]: *** Including module: dracut-systemd ***
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]: *** Including module: usrmount ***
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]: *** Including module: base ***
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]: *** Including module: fs-lib ***
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]: *** Including module: kdumpbase ***
Dec 10 09:30:00 np0005553242.novalocal irqbalance[781]: Cannot change IRQ 25 affinity: Operation not permitted
Dec 10 09:30:00 np0005553242.novalocal irqbalance[781]: IRQ 25 affinity is now unmanaged
Dec 10 09:30:00 np0005553242.novalocal irqbalance[781]: Cannot change IRQ 31 affinity: Operation not permitted
Dec 10 09:30:00 np0005553242.novalocal irqbalance[781]: IRQ 31 affinity is now unmanaged
Dec 10 09:30:00 np0005553242.novalocal irqbalance[781]: Cannot change IRQ 28 affinity: Operation not permitted
Dec 10 09:30:00 np0005553242.novalocal irqbalance[781]: IRQ 28 affinity is now unmanaged
Dec 10 09:30:00 np0005553242.novalocal irqbalance[781]: Cannot change IRQ 32 affinity: Operation not permitted
Dec 10 09:30:00 np0005553242.novalocal irqbalance[781]: IRQ 32 affinity is now unmanaged
Dec 10 09:30:00 np0005553242.novalocal irqbalance[781]: Cannot change IRQ 30 affinity: Operation not permitted
Dec 10 09:30:00 np0005553242.novalocal irqbalance[781]: IRQ 30 affinity is now unmanaged
Dec 10 09:30:00 np0005553242.novalocal irqbalance[781]: Cannot change IRQ 29 affinity: Operation not permitted
Dec 10 09:30:00 np0005553242.novalocal irqbalance[781]: IRQ 29 affinity is now unmanaged
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]:   microcode_ctl module: mangling fw_dir
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 10 09:30:00 np0005553242.novalocal chronyd[800]: Selected source 51.222.12.92 (2.centos.pool.ntp.org)
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]:     microcode_ctl: configuration "intel" is ignored
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 10 09:30:00 np0005553242.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]: *** Including module: openssl ***
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]: *** Including module: shutdown ***
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]: *** Including module: squash ***
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]: *** Including modules done ***
Dec 10 09:30:01 np0005553242.novalocal dracut[1268]: *** Installing kernel module dependencies ***
Dec 10 09:30:02 np0005553242.novalocal dracut[1268]: *** Installing kernel module dependencies done ***
Dec 10 09:30:02 np0005553242.novalocal dracut[1268]: *** Resolving executable dependencies ***
Dec 10 09:30:03 np0005553242.novalocal dracut[1268]: *** Resolving executable dependencies done ***
Dec 10 09:30:03 np0005553242.novalocal dracut[1268]: *** Generating early-microcode cpio image ***
Dec 10 09:30:03 np0005553242.novalocal dracut[1268]: *** Store current command line parameters ***
Dec 10 09:30:03 np0005553242.novalocal dracut[1268]: Stored kernel commandline:
Dec 10 09:30:03 np0005553242.novalocal dracut[1268]: No dracut internal kernel commandline stored in the initramfs
Dec 10 09:30:03 np0005553242.novalocal dracut[1268]: *** Install squash loader ***
Dec 10 09:30:03 np0005553242.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 10 09:30:04 np0005553242.novalocal dracut[1268]: *** Squashing the files inside the initramfs ***
Dec 10 09:30:05 np0005553242.novalocal dracut[1268]: *** Squashing the files inside the initramfs done ***
Dec 10 09:30:05 np0005553242.novalocal dracut[1268]: *** Creating image file '/boot/initramfs-5.14.0-648.el9.x86_64kdump.img' ***
Dec 10 09:30:05 np0005553242.novalocal dracut[1268]: *** Hardlinking files ***
Dec 10 09:30:05 np0005553242.novalocal dracut[1268]: Mode:           real
Dec 10 09:30:05 np0005553242.novalocal dracut[1268]: Files:          50
Dec 10 09:30:05 np0005553242.novalocal dracut[1268]: Linked:         0 files
Dec 10 09:30:05 np0005553242.novalocal dracut[1268]: Compared:       0 xattrs
Dec 10 09:30:05 np0005553242.novalocal dracut[1268]: Compared:       0 files
Dec 10 09:30:05 np0005553242.novalocal dracut[1268]: Saved:          0 B
Dec 10 09:30:05 np0005553242.novalocal dracut[1268]: Duration:       0.001077 seconds
Dec 10 09:30:05 np0005553242.novalocal dracut[1268]: *** Hardlinking files done ***
Dec 10 09:30:06 np0005553242.novalocal dracut[1268]: *** Creating initramfs image file '/boot/initramfs-5.14.0-648.el9.x86_64kdump.img' done ***
Dec 10 09:30:06 np0005553242.novalocal kdumpctl[1016]: kdump: kexec: loaded kdump kernel
Dec 10 09:30:06 np0005553242.novalocal kdumpctl[1016]: kdump: Starting kdump: [OK]
Dec 10 09:30:06 np0005553242.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 10 09:30:06 np0005553242.novalocal systemd[1]: Startup finished in 2.134s (kernel) + 2.552s (initrd) + 17.261s (userspace) = 21.948s.
Dec 10 09:30:13 np0005553242.novalocal sshd-session[4296]: Accepted publickey for zuul from 38.102.83.114 port 50360 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 10 09:30:13 np0005553242.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 10 09:30:13 np0005553242.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 10 09:30:13 np0005553242.novalocal systemd-logind[787]: New session 1 of user zuul.
Dec 10 09:30:13 np0005553242.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 10 09:30:13 np0005553242.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Queued start job for default target Main User Target.
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Created slice User Application Slice.
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Started Daily Cleanup of User's Temporary Directories.
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Reached target Paths.
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Reached target Timers.
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Starting D-Bus User Message Bus Socket...
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Starting Create User's Volatile Files and Directories...
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Listening on D-Bus User Message Bus Socket.
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Reached target Sockets.
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Finished Create User's Volatile Files and Directories.
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Reached target Basic System.
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Reached target Main User Target.
Dec 10 09:30:13 np0005553242.novalocal systemd[4300]: Startup finished in 147ms.
Dec 10 09:30:13 np0005553242.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 10 09:30:13 np0005553242.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 10 09:30:13 np0005553242.novalocal sshd-session[4296]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 09:30:13 np0005553242.novalocal python3[4382]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 09:30:16 np0005553242.novalocal python3[4410]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 09:30:21 np0005553242.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 10 09:30:22 np0005553242.novalocal python3[4470]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 09:30:23 np0005553242.novalocal python3[4510]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 10 09:30:24 np0005553242.novalocal python3[4536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDXlBkAgKpcTozle+6OJWLoLjx3EpOgfQGPoP0Cukn8awv+pu3oirieWp2nbDBLTWR94lN6GXeEw2BMtd7sJmqehQWfERtMY3rXCQYmEDCueH6gpdBHdohKslj8z6H92JDVQUQqYAxcOR1Tpc1dZvvOiItnCd4Q1hTOVf2UQJIxbiveiB/EWGQ3yYUtQn6MXnxBI8ks8tituOGOQsO39YvippGOX7NJcf41L+WtC/+NRHplkPOYtZLI1WAyHt+otE3lGf3Jrz8WWy2JY5FK4ZEvyjIpVF+X4zCg+I9bwywmSB3bjQYHHohAZf2wo26xr/xLCoPFPlgNFV1VGMe3+1ShgYT0gh84J7+yX4HhACGoLX0ONEdU/ImzvoTtW9PDvhZdWPUvxWy0msaPurM8J8e5ISO0Zbn1wOFKxIabb3EfM7cZxvTZv8wdTkO4yE7VZb5RvKFZwBucxaR33UOUr20F3Rhs90Mak2SM9VIbuwaR+BMIjkTH+jkqdBUu+eO17VU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:25 np0005553242.novalocal python3[4560]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:25 np0005553242.novalocal python3[4659]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:30:26 np0005553242.novalocal python3[4730]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765359025.3362465-207-177935284178915/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=ec237d2157b143d88b3e534b87c85acc_id_rsa follow=False checksum=fa5560ef5c4d09ede39c78b7fb48797c93d037fa backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:26 np0005553242.novalocal python3[4853]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:30:26 np0005553242.novalocal python3[4924]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765359026.2349873-240-20194209245157/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=ec237d2157b143d88b3e534b87c85acc_id_rsa.pub follow=False checksum=dea46f70c65fa4ce03e8be13d76fca8101d026fa backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:28 np0005553242.novalocal python3[4972]: ansible-ping Invoked with data=pong
Dec 10 09:30:29 np0005553242.novalocal python3[4996]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 09:30:30 np0005553242.novalocal python3[5054]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 10 09:30:31 np0005553242.novalocal sshd-session[5058]: Invalid user sol from 193.32.162.146 port 37632
Dec 10 09:30:31 np0005553242.novalocal sshd-session[5058]: Connection closed by invalid user sol 193.32.162.146 port 37632 [preauth]
Dec 10 09:30:31 np0005553242.novalocal python3[5088]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:31 np0005553242.novalocal python3[5112]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:32 np0005553242.novalocal python3[5136]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:32 np0005553242.novalocal python3[5160]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:32 np0005553242.novalocal python3[5184]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:33 np0005553242.novalocal python3[5208]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:34 np0005553242.novalocal sudo[5232]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgoekoktqleymwremjzdckllxxsjdosa ; /usr/bin/python3'
Dec 10 09:30:34 np0005553242.novalocal sudo[5232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:30:34 np0005553242.novalocal python3[5234]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:34 np0005553242.novalocal sudo[5232]: pam_unix(sudo:session): session closed for user root
Dec 10 09:30:35 np0005553242.novalocal sudo[5310]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfreeqygqaabjnwtcnoruksbzdcqrhva ; /usr/bin/python3'
Dec 10 09:30:35 np0005553242.novalocal sudo[5310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:30:35 np0005553242.novalocal python3[5312]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:30:35 np0005553242.novalocal sudo[5310]: pam_unix(sudo:session): session closed for user root
Dec 10 09:30:35 np0005553242.novalocal sudo[5383]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pijywrknahgiasyhryvjhuxixyvxdlot ; /usr/bin/python3'
Dec 10 09:30:35 np0005553242.novalocal sudo[5383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:30:35 np0005553242.novalocal python3[5385]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765359034.758294-21-24255431382747/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:35 np0005553242.novalocal sudo[5383]: pam_unix(sudo:session): session closed for user root
Dec 10 09:30:36 np0005553242.novalocal python3[5433]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:36 np0005553242.novalocal python3[5457]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:36 np0005553242.novalocal python3[5481]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:36 np0005553242.novalocal python3[5505]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:37 np0005553242.novalocal python3[5529]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:37 np0005553242.novalocal python3[5553]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:37 np0005553242.novalocal python3[5577]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:38 np0005553242.novalocal python3[5601]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:38 np0005553242.novalocal python3[5625]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:38 np0005553242.novalocal python3[5649]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:38 np0005553242.novalocal python3[5673]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:39 np0005553242.novalocal python3[5697]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:39 np0005553242.novalocal python3[5721]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:39 np0005553242.novalocal python3[5745]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:39 np0005553242.novalocal python3[5769]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:40 np0005553242.novalocal python3[5793]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:40 np0005553242.novalocal python3[5817]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:40 np0005553242.novalocal python3[5841]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:40 np0005553242.novalocal python3[5865]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:41 np0005553242.novalocal python3[5889]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:41 np0005553242.novalocal python3[5913]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:41 np0005553242.novalocal python3[5937]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:41 np0005553242.novalocal python3[5961]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:42 np0005553242.novalocal python3[5985]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:42 np0005553242.novalocal python3[6009]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:42 np0005553242.novalocal python3[6033]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:30:45 np0005553242.novalocal sudo[6057]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuqasjgffvldryukrfmeptxvsavcqutq ; /usr/bin/python3'
Dec 10 09:30:45 np0005553242.novalocal sudo[6057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:30:45 np0005553242.novalocal python3[6059]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 10 09:30:45 np0005553242.novalocal systemd[1]: Starting Time & Date Service...
Dec 10 09:30:45 np0005553242.novalocal systemd[1]: Started Time & Date Service.
Dec 10 09:30:46 np0005553242.novalocal systemd-timedated[6061]: Changed time zone to 'UTC' (UTC).
Dec 10 09:30:46 np0005553242.novalocal sudo[6057]: pam_unix(sudo:session): session closed for user root
Dec 10 09:30:46 np0005553242.novalocal sudo[6088]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbowpknygdpowjouaanarufowpijbydd ; /usr/bin/python3'
Dec 10 09:30:46 np0005553242.novalocal sudo[6088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:30:46 np0005553242.novalocal python3[6090]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:46 np0005553242.novalocal sudo[6088]: pam_unix(sudo:session): session closed for user root
Dec 10 09:30:46 np0005553242.novalocal python3[6166]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:30:47 np0005553242.novalocal python3[6237]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1765359046.6195796-153-10822838055129/source _original_basename=tmp8i6_3xi5 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:47 np0005553242.novalocal python3[6337]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:30:48 np0005553242.novalocal python3[6408]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765359047.4510257-183-93371795129375/source _original_basename=tmp8ioorvb7 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:48 np0005553242.novalocal sudo[6508]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpppruedoxhwhksoawntbunfijuguguh ; /usr/bin/python3'
Dec 10 09:30:48 np0005553242.novalocal sudo[6508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:30:48 np0005553242.novalocal python3[6510]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:30:48 np0005553242.novalocal sudo[6508]: pam_unix(sudo:session): session closed for user root
Dec 10 09:30:48 np0005553242.novalocal sudo[6581]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izxrzauizmknraejpvvzwkgqegjueghw ; /usr/bin/python3'
Dec 10 09:30:48 np0005553242.novalocal sudo[6581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:30:49 np0005553242.novalocal python3[6583]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765359048.4585297-231-81050991700273/source _original_basename=tmp_6bqepeg follow=False checksum=873438299bb17ff1128a56bbeb324b7beaf57647 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:49 np0005553242.novalocal sudo[6581]: pam_unix(sudo:session): session closed for user root
Dec 10 09:30:49 np0005553242.novalocal python3[6631]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:30:49 np0005553242.novalocal python3[6657]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:30:50 np0005553242.novalocal sudo[6735]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uigdckzucezalontgsarwzocanwitfye ; /usr/bin/python3'
Dec 10 09:30:50 np0005553242.novalocal sudo[6735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:30:50 np0005553242.novalocal python3[6737]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:30:50 np0005553242.novalocal sudo[6735]: pam_unix(sudo:session): session closed for user root
Dec 10 09:30:50 np0005553242.novalocal sudo[6808]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyxrjznrywketyxbdxqcarmwpijpkvvm ; /usr/bin/python3'
Dec 10 09:30:50 np0005553242.novalocal sudo[6808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:30:50 np0005553242.novalocal python3[6810]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1765359050.029474-273-235561427746490/source _original_basename=tmp57wif454 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:30:50 np0005553242.novalocal sudo[6808]: pam_unix(sudo:session): session closed for user root
Dec 10 09:30:51 np0005553242.novalocal sudo[6859]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcupaxexulxfpeeqffzsswtgkqmdodci ; /usr/bin/python3'
Dec 10 09:30:51 np0005553242.novalocal sudo[6859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:30:51 np0005553242.novalocal python3[6861]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-75e2-5df3-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:30:51 np0005553242.novalocal sudo[6859]: pam_unix(sudo:session): session closed for user root
Dec 10 09:30:51 np0005553242.novalocal python3[6889]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-75e2-5df3-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 10 09:30:53 np0005553242.novalocal python3[6918]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:31:12 np0005553242.novalocal sudo[6942]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaykkckrvbjulgyuqdxjfsybavpgkipj ; /usr/bin/python3'
Dec 10 09:31:12 np0005553242.novalocal sudo[6942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:31:12 np0005553242.novalocal python3[6944]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:31:12 np0005553242.novalocal sudo[6942]: pam_unix(sudo:session): session closed for user root
Dec 10 09:31:16 np0005553242.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 10 09:31:52 np0005553242.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 10 09:31:52 np0005553242.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec 10 09:31:52 np0005553242.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 10 09:31:52 np0005553242.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 10 09:31:52 np0005553242.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec 10 09:31:52 np0005553242.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec 10 09:31:52 np0005553242.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec 10 09:31:52 np0005553242.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec 10 09:31:52 np0005553242.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec 10 09:31:52 np0005553242.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 10 09:31:52 np0005553242.novalocal NetworkManager[859]: <info>  [1765359112.1941] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 10 09:31:52 np0005553242.novalocal systemd-udevd[6948]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 09:31:52 np0005553242.novalocal NetworkManager[859]: <info>  [1765359112.2141] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 09:31:52 np0005553242.novalocal NetworkManager[859]: <info>  [1765359112.2168] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 10 09:31:52 np0005553242.novalocal NetworkManager[859]: <info>  [1765359112.2170] device (eth1): carrier: link connected
Dec 10 09:31:52 np0005553242.novalocal NetworkManager[859]: <info>  [1765359112.2172] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 10 09:31:52 np0005553242.novalocal NetworkManager[859]: <info>  [1765359112.2176] policy: auto-activating connection 'Wired connection 1' (38855df4-24db-33d9-b8f2-98603420bda3)
Dec 10 09:31:52 np0005553242.novalocal NetworkManager[859]: <info>  [1765359112.2179] device (eth1): Activation: starting connection 'Wired connection 1' (38855df4-24db-33d9-b8f2-98603420bda3)
Dec 10 09:31:52 np0005553242.novalocal NetworkManager[859]: <info>  [1765359112.2180] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 09:31:52 np0005553242.novalocal NetworkManager[859]: <info>  [1765359112.2182] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 09:31:52 np0005553242.novalocal NetworkManager[859]: <info>  [1765359112.2185] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 09:31:52 np0005553242.novalocal NetworkManager[859]: <info>  [1765359112.2189] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 10 09:31:54 np0005553242.novalocal python3[6974]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-e9cc-4ee5-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:32:00 np0005553242.novalocal sudo[7052]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvmhgixekrzcltynwwuefuwqvscglylv ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 10 09:32:00 np0005553242.novalocal sudo[7052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:32:01 np0005553242.novalocal python3[7054]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:32:01 np0005553242.novalocal sudo[7052]: pam_unix(sudo:session): session closed for user root
Dec 10 09:32:01 np0005553242.novalocal sudo[7125]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgnzcyiummtpkjrtuvlqwtlknhptxcvd ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 10 09:32:01 np0005553242.novalocal sudo[7125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:32:01 np0005553242.novalocal python3[7127]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765359120.7179415-102-235877643068067/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=6daf767d7c5f6a5a8b576bd2635e8b43326443fb backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:32:01 np0005553242.novalocal sudo[7125]: pam_unix(sudo:session): session closed for user root
Dec 10 09:32:01 np0005553242.novalocal sudo[7175]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzsrizqkmuuxhyrxkwphszjsxujvdcdq ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 10 09:32:01 np0005553242.novalocal sudo[7175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:32:02 np0005553242.novalocal python3[7177]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: Stopping Network Manager...
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[859]: <info>  [1765359122.2802] caught SIGTERM, shutting down normally.
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[859]: <info>  [1765359122.2814] dhcp4 (eth0): canceled DHCP transaction
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[859]: <info>  [1765359122.2814] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[859]: <info>  [1765359122.2814] dhcp4 (eth0): state changed no lease
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[859]: <info>  [1765359122.2817] manager: NetworkManager state is now CONNECTING
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[859]: <info>  [1765359122.2937] dhcp4 (eth1): canceled DHCP transaction
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[859]: <info>  [1765359122.2937] dhcp4 (eth1): state changed no lease
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[859]: <info>  [1765359122.2991] exiting (success)
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: Stopped Network Manager.
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: Starting Network Manager...
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.3570] NetworkManager (version 1.54.2-1.el9) is starting... (after a restart, boot:1f343dd7-be59-44c1-890a-3a416daf01a6)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.3573] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.3637] manager[0x55776abef000]: monitoring kernel firmware directory '/lib/firmware'.
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: Starting Hostname Service...
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: Started Hostname Service.
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4609] hostname: hostname: using hostnamed
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4610] hostname: static hostname changed from (none) to "np0005553242.novalocal"
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4615] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4622] manager[0x55776abef000]: rfkill: Wi-Fi hardware radio set enabled
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4622] manager[0x55776abef000]: rfkill: WWAN hardware radio set enabled
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4651] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4651] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4652] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4652] manager: Networking is enabled by state file
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4654] settings: Loaded settings plugin: keyfile (internal)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4658] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4680] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4689] dhcp: init: Using DHCP client 'internal'
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4691] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4696] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4700] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4706] device (lo): Activation: starting connection 'lo' (0756ffbc-1ad3-4f52-9877-3151352e5ed6)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4711] device (eth0): carrier: link connected
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4714] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4718] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4718] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4723] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4728] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4733] device (eth1): carrier: link connected
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4737] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4741] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (38855df4-24db-33d9-b8f2-98603420bda3) (indicated)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4742] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4745] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4751] device (eth1): Activation: starting connection 'Wired connection 1' (38855df4-24db-33d9-b8f2-98603420bda3)
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: Started Network Manager.
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4756] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4759] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4761] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4763] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4764] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4767] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4769] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4778] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4783] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4790] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4793] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4802] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4805] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4820] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4825] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4830] device (lo): Activation: successful, device activated.
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4837] dhcp4 (eth0): state changed new lease, address=38.102.83.219
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4843] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4902] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4926] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4928] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4932] manager: NetworkManager state is now CONNECTED_SITE
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4936] device (eth0): Activation: successful, device activated.
Dec 10 09:32:02 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359122.4942] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 10 09:32:02 np0005553242.novalocal sudo[7175]: pam_unix(sudo:session): session closed for user root
Dec 10 09:32:03 np0005553242.novalocal python3[7261]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-e9cc-4ee5-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:32:12 np0005553242.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 10 09:32:32 np0005553242.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 10 09:32:39 np0005553242.novalocal systemd[4300]: Starting Mark boot as successful...
Dec 10 09:32:39 np0005553242.novalocal systemd[4300]: Finished Mark boot as successful.
Dec 10 09:32:45 np0005553242.novalocal sshd-session[7267]: Invalid user solana from 193.32.162.146 port 53334
Dec 10 09:32:45 np0005553242.novalocal sshd-session[7267]: Connection closed by invalid user solana 193.32.162.146 port 53334 [preauth]
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7273] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 10 09:32:47 np0005553242.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 10 09:32:47 np0005553242.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7569] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7573] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7584] device (eth1): Activation: successful, device activated.
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7596] manager: startup complete
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7599] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <warn>  [1765359167.7606] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7629] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 10 09:32:47 np0005553242.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7797] dhcp4 (eth1): canceled DHCP transaction
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7798] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7799] dhcp4 (eth1): state changed no lease
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7819] policy: auto-activating connection 'ci-private-network' (9612a26e-51e9-58b1-bfd8-b472df5bf061)
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7825] device (eth1): Activation: starting connection 'ci-private-network' (9612a26e-51e9-58b1-bfd8-b472df5bf061)
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7826] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7831] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7840] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7854] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7907] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7909] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 09:32:47 np0005553242.novalocal NetworkManager[7187]: <info>  [1765359167.7919] device (eth1): Activation: successful, device activated.
Dec 10 09:32:57 np0005553242.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 10 09:33:03 np0005553242.novalocal sshd-session[4309]: Received disconnect from 38.102.83.114 port 50360:11: disconnected by user
Dec 10 09:33:03 np0005553242.novalocal sshd-session[4309]: Disconnected from user zuul 38.102.83.114 port 50360
Dec 10 09:33:03 np0005553242.novalocal sshd-session[4296]: pam_unix(sshd:session): session closed for user zuul
Dec 10 09:33:03 np0005553242.novalocal systemd-logind[787]: Session 1 logged out. Waiting for processes to exit.
Dec 10 09:33:05 np0005553242.novalocal sshd-session[7292]: Accepted publickey for zuul from 38.102.83.114 port 42444 ssh2: RSA SHA256:B2Noj5c+ufeLikSCib8rdoBjF+7fxxkMjXUJkCp1GYw
Dec 10 09:33:05 np0005553242.novalocal systemd-logind[787]: New session 3 of user zuul.
Dec 10 09:33:05 np0005553242.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 10 09:33:05 np0005553242.novalocal sshd-session[7292]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 09:33:06 np0005553242.novalocal sudo[7371]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kknotjwztxfclmdpgvxnpalxbdmfocbm ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 10 09:33:06 np0005553242.novalocal sudo[7371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:33:06 np0005553242.novalocal python3[7373]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:33:06 np0005553242.novalocal sudo[7371]: pam_unix(sudo:session): session closed for user root
Dec 10 09:33:06 np0005553242.novalocal sudo[7444]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onbesccnqcnnvndinzlecqnfbmsbgghm ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 10 09:33:06 np0005553242.novalocal sudo[7444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:33:06 np0005553242.novalocal python3[7446]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765359185.9701052-259-84964720042801/source _original_basename=tmpg32a85i5 follow=False checksum=8ab16b225ba895170597fd625b985e58c2c59185 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:33:06 np0005553242.novalocal sudo[7444]: pam_unix(sudo:session): session closed for user root
Dec 10 09:33:08 np0005553242.novalocal sshd-session[7295]: Connection closed by 38.102.83.114 port 42444
Dec 10 09:33:08 np0005553242.novalocal sshd-session[7292]: pam_unix(sshd:session): session closed for user zuul
Dec 10 09:33:08 np0005553242.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 10 09:33:08 np0005553242.novalocal systemd-logind[787]: Session 3 logged out. Waiting for processes to exit.
Dec 10 09:33:08 np0005553242.novalocal systemd-logind[787]: Removed session 3.
Dec 10 09:34:56 np0005553242.novalocal sshd-session[7471]: Invalid user solana from 193.32.162.146 port 40844
Dec 10 09:34:56 np0005553242.novalocal sshd-session[7471]: Connection closed by invalid user solana 193.32.162.146 port 40844 [preauth]
Dec 10 09:35:39 np0005553242.novalocal systemd[4300]: Created slice User Background Tasks Slice.
Dec 10 09:35:39 np0005553242.novalocal systemd[4300]: Starting Cleanup of User's Temporary Files and Directories...
Dec 10 09:35:39 np0005553242.novalocal systemd[4300]: Finished Cleanup of User's Temporary Files and Directories.
Dec 10 09:37:03 np0005553242.novalocal sshd-session[7476]: Invalid user solana from 193.32.162.146 port 56554
Dec 10 09:37:03 np0005553242.novalocal sshd-session[7476]: Connection closed by invalid user solana 193.32.162.146 port 56554 [preauth]
Dec 10 09:38:52 np0005553242.novalocal sshd-session[7479]: Accepted publickey for zuul from 38.102.83.114 port 59402 ssh2: RSA SHA256:B2Noj5c+ufeLikSCib8rdoBjF+7fxxkMjXUJkCp1GYw
Dec 10 09:38:52 np0005553242.novalocal systemd-logind[787]: New session 4 of user zuul.
Dec 10 09:38:52 np0005553242.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 10 09:38:52 np0005553242.novalocal sshd-session[7479]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 09:38:52 np0005553242.novalocal sudo[7506]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvstvpccjhrcjbdbhfizthkzyfkxkhmv ; /usr/bin/python3'
Dec 10 09:38:52 np0005553242.novalocal sudo[7506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:53 np0005553242.novalocal python3[7508]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-c1e5-8c52-000000001f05-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:38:53 np0005553242.novalocal sudo[7506]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:53 np0005553242.novalocal sudo[7535]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlstileehvgrznyubzvitfxwzommvouj ; /usr/bin/python3'
Dec 10 09:38:53 np0005553242.novalocal sudo[7535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:53 np0005553242.novalocal python3[7537]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:38:53 np0005553242.novalocal sudo[7535]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:53 np0005553242.novalocal sudo[7561]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmvjrqqetmxqfinkycfkprxyobuhcqmn ; /usr/bin/python3'
Dec 10 09:38:53 np0005553242.novalocal sudo[7561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:53 np0005553242.novalocal python3[7563]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:38:53 np0005553242.novalocal sudo[7561]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:53 np0005553242.novalocal sudo[7587]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjpasarnzlifpisyanxdtfiyxtehakmf ; /usr/bin/python3'
Dec 10 09:38:53 np0005553242.novalocal sudo[7587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:53 np0005553242.novalocal python3[7589]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:38:53 np0005553242.novalocal sudo[7587]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:53 np0005553242.novalocal sudo[7613]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnborrujeoktzbjpvgdfncyugmwiyfkd ; /usr/bin/python3'
Dec 10 09:38:53 np0005553242.novalocal sudo[7613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:54 np0005553242.novalocal python3[7615]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:38:54 np0005553242.novalocal sudo[7613]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:54 np0005553242.novalocal sudo[7639]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwunzigfmadbzbdirqoebkmrhkmkchtj ; /usr/bin/python3'
Dec 10 09:38:54 np0005553242.novalocal sudo[7639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:54 np0005553242.novalocal python3[7641]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:38:54 np0005553242.novalocal sudo[7639]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:54 np0005553242.novalocal sudo[7717]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqafhpsimgstzrkubzkguypkcopvsryq ; /usr/bin/python3'
Dec 10 09:38:54 np0005553242.novalocal sudo[7717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:55 np0005553242.novalocal python3[7719]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:38:55 np0005553242.novalocal sudo[7717]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:55 np0005553242.novalocal sudo[7790]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haglccmfdfkkiqdzkpudtleyxondedjw ; /usr/bin/python3'
Dec 10 09:38:55 np0005553242.novalocal sudo[7790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:55 np0005553242.novalocal python3[7792]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765359534.7996328-479-183440684755597/source _original_basename=tmp0gjg56mt follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:38:55 np0005553242.novalocal sudo[7790]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:55 np0005553242.novalocal sudo[7840]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egahbbataqtiygywjxsuyythoufivyiu ; /usr/bin/python3'
Dec 10 09:38:55 np0005553242.novalocal sudo[7840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:56 np0005553242.novalocal python3[7842]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 09:38:56 np0005553242.novalocal systemd[1]: Reloading.
Dec 10 09:38:56 np0005553242.novalocal systemd-rc-local-generator[7866]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 09:38:56 np0005553242.novalocal sudo[7840]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:57 np0005553242.novalocal sudo[7896]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulzeocpbjcimskkvxakzxtfeayqfoyho ; /usr/bin/python3'
Dec 10 09:38:57 np0005553242.novalocal sudo[7896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:57 np0005553242.novalocal python3[7898]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 10 09:38:57 np0005553242.novalocal sudo[7896]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:58 np0005553242.novalocal sudo[7922]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyswxhlifbryqxcmozpzyiuxgdmspetg ; /usr/bin/python3'
Dec 10 09:38:58 np0005553242.novalocal sudo[7922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:58 np0005553242.novalocal python3[7924]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:38:58 np0005553242.novalocal sudo[7922]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:58 np0005553242.novalocal sudo[7951]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytpmiaiyprherimlmqawzulqskuprraz ; /usr/bin/python3'
Dec 10 09:38:58 np0005553242.novalocal sudo[7951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:58 np0005553242.novalocal python3[7953]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:38:58 np0005553242.novalocal sudo[7951]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:58 np0005553242.novalocal sudo[7979]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glgvluxgwrlhuaeyvdepijhwvivxmqxj ; /usr/bin/python3'
Dec 10 09:38:58 np0005553242.novalocal sudo[7979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:58 np0005553242.novalocal python3[7981]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:38:58 np0005553242.novalocal sudo[7979]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:58 np0005553242.novalocal sudo[8007]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcotevpicmefnsgidiiphhsedsdzyvve ; /usr/bin/python3'
Dec 10 09:38:58 np0005553242.novalocal sudo[8007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:38:59 np0005553242.novalocal python3[8009]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:38:59 np0005553242.novalocal sudo[8007]: pam_unix(sudo:session): session closed for user root
Dec 10 09:38:59 np0005553242.novalocal python3[8036]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-c1e5-8c52-000000001f0c-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:39:00 np0005553242.novalocal python3[8066]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 10 09:39:01 np0005553242.novalocal sshd-session[7482]: Connection closed by 38.102.83.114 port 59402
Dec 10 09:39:01 np0005553242.novalocal sshd-session[7479]: pam_unix(sshd:session): session closed for user zuul
Dec 10 09:39:01 np0005553242.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 10 09:39:01 np0005553242.novalocal systemd[1]: session-4.scope: Consumed 4.228s CPU time.
Dec 10 09:39:01 np0005553242.novalocal systemd-logind[787]: Session 4 logged out. Waiting for processes to exit.
Dec 10 09:39:01 np0005553242.novalocal systemd-logind[787]: Removed session 4.
Dec 10 09:39:03 np0005553242.novalocal sshd-session[8072]: Accepted publickey for zuul from 38.102.83.114 port 47596 ssh2: RSA SHA256:B2Noj5c+ufeLikSCib8rdoBjF+7fxxkMjXUJkCp1GYw
Dec 10 09:39:03 np0005553242.novalocal systemd-logind[787]: New session 5 of user zuul.
Dec 10 09:39:03 np0005553242.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 10 09:39:03 np0005553242.novalocal sshd-session[8072]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 09:39:03 np0005553242.novalocal sudo[8099]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yueicuoazgrmzczfrvecgjayvayuqdwm ; /usr/bin/python3'
Dec 10 09:39:03 np0005553242.novalocal sudo[8099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:39:03 np0005553242.novalocal python3[8101]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 10 09:39:12 np0005553242.novalocal sshd-session[8145]: Invalid user sol from 193.32.162.146 port 44046
Dec 10 09:39:12 np0005553242.novalocal sshd-session[8145]: Connection closed by invalid user sol 193.32.162.146 port 44046 [preauth]
Dec 10 09:39:18 np0005553242.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 10 09:39:18 np0005553242.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 10 09:39:18 np0005553242.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 10 09:39:18 np0005553242.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 10 09:39:18 np0005553242.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 10 09:39:18 np0005553242.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 10 09:39:18 np0005553242.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 10 09:39:18 np0005553242.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 10 09:39:27 np0005553242.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 10 09:39:27 np0005553242.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 10 09:39:27 np0005553242.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 10 09:39:27 np0005553242.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 10 09:39:27 np0005553242.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 10 09:39:27 np0005553242.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 10 09:39:27 np0005553242.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 10 09:39:27 np0005553242.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 10 09:39:37 np0005553242.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 10 09:39:37 np0005553242.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 10 09:39:37 np0005553242.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 10 09:39:37 np0005553242.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 10 09:39:37 np0005553242.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 10 09:39:37 np0005553242.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 10 09:39:37 np0005553242.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 10 09:39:37 np0005553242.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 10 09:39:38 np0005553242.novalocal setsebool[8170]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 10 09:39:38 np0005553242.novalocal setsebool[8170]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 10 09:39:40 np0005553242.novalocal irqbalance[781]: Cannot change IRQ 27 affinity: Operation not permitted
Dec 10 09:39:40 np0005553242.novalocal irqbalance[781]: IRQ 27 affinity is now unmanaged
Dec 10 09:39:49 np0005553242.novalocal kernel: SELinux:  Converting 388 SID table entries...
Dec 10 09:39:49 np0005553242.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 10 09:39:49 np0005553242.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 10 09:39:49 np0005553242.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 10 09:39:49 np0005553242.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 10 09:39:49 np0005553242.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 10 09:39:49 np0005553242.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 10 09:39:49 np0005553242.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 10 09:40:07 np0005553242.novalocal dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 10 09:40:07 np0005553242.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 10 09:40:07 np0005553242.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 10 09:40:07 np0005553242.novalocal systemd[1]: Reloading.
Dec 10 09:40:07 np0005553242.novalocal systemd-rc-local-generator[8920]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 09:40:07 np0005553242.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 10 09:40:09 np0005553242.novalocal sudo[8099]: pam_unix(sudo:session): session closed for user root
Dec 10 09:40:09 np0005553242.novalocal python3[10522]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163efc-24cc-eb8a-3a10-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:40:10 np0005553242.novalocal kernel: evm: overlay not supported
Dec 10 09:40:10 np0005553242.novalocal systemd[4300]: Starting D-Bus User Message Bus...
Dec 10 09:40:10 np0005553242.novalocal dbus-broker-launch[11722]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 10 09:40:10 np0005553242.novalocal dbus-broker-launch[11722]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 10 09:40:10 np0005553242.novalocal systemd[4300]: Started D-Bus User Message Bus.
Dec 10 09:40:10 np0005553242.novalocal dbus-broker-lau[11722]: Ready
Dec 10 09:40:10 np0005553242.novalocal systemd[4300]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 10 09:40:10 np0005553242.novalocal systemd[4300]: Created slice Slice /user.
Dec 10 09:40:10 np0005553242.novalocal systemd[4300]: podman-11602.scope: unit configures an IP firewall, but not running as root.
Dec 10 09:40:10 np0005553242.novalocal systemd[4300]: (This warning is only shown for the first unit using IP firewalling.)
Dec 10 09:40:10 np0005553242.novalocal systemd[4300]: Started podman-11602.scope.
Dec 10 09:40:10 np0005553242.novalocal systemd[4300]: Started podman-pause-053290f1.scope.
Dec 10 09:40:11 np0005553242.novalocal sudo[12399]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lifbgmckocphrovxsvfupbxgqdtnsscj ; /usr/bin/python3'
Dec 10 09:40:11 np0005553242.novalocal sudo[12399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:40:11 np0005553242.novalocal python3[12425]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.129.56.248:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.129.56.248:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:40:11 np0005553242.novalocal python3[12425]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec 10 09:40:11 np0005553242.novalocal sudo[12399]: pam_unix(sudo:session): session closed for user root
Dec 10 09:40:11 np0005553242.novalocal sshd-session[8075]: Connection closed by 38.102.83.114 port 47596
Dec 10 09:40:11 np0005553242.novalocal sshd-session[8072]: pam_unix(sshd:session): session closed for user zuul
Dec 10 09:40:11 np0005553242.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Dec 10 09:40:11 np0005553242.novalocal systemd[1]: session-5.scope: Consumed 1min 1.630s CPU time.
Dec 10 09:40:11 np0005553242.novalocal systemd-logind[787]: Session 5 logged out. Waiting for processes to exit.
Dec 10 09:40:11 np0005553242.novalocal systemd-logind[787]: Removed session 5.
Dec 10 09:40:50 np0005553242.novalocal sshd-session[28373]: Unable to negotiate with 38.102.83.136 port 47492: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 10 09:40:50 np0005553242.novalocal sshd-session[28371]: Connection closed by 38.102.83.136 port 47468 [preauth]
Dec 10 09:40:50 np0005553242.novalocal sshd-session[28375]: Connection closed by 38.102.83.136 port 47482 [preauth]
Dec 10 09:40:50 np0005553242.novalocal sshd-session[28378]: Unable to negotiate with 38.102.83.136 port 47504: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 10 09:40:50 np0005553242.novalocal sshd-session[28379]: Unable to negotiate with 38.102.83.136 port 47506: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 10 09:40:53 np0005553242.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 10 09:40:53 np0005553242.novalocal systemd[1]: Finished man-db-cache-update.service.
Dec 10 09:40:53 np0005553242.novalocal systemd[1]: man-db-cache-update.service: Consumed 56.433s CPU time.
Dec 10 09:40:53 np0005553242.novalocal systemd[1]: run-rbd80b0c4334445b7b89900f26ee9bd60.service: Deactivated successfully.
Dec 10 09:40:53 np0005553242.novalocal sshd-session[29587]: Accepted publickey for zuul from 38.102.83.114 port 53516 ssh2: RSA SHA256:B2Noj5c+ufeLikSCib8rdoBjF+7fxxkMjXUJkCp1GYw
Dec 10 09:40:53 np0005553242.novalocal systemd-logind[787]: New session 6 of user zuul.
Dec 10 09:40:53 np0005553242.novalocal systemd[1]: Started Session 6 of User zuul.
Dec 10 09:40:53 np0005553242.novalocal sshd-session[29587]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 09:40:54 np0005553242.novalocal python3[29614]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ6t0ovugDjwmyWCbbwpPBp7t7Gxh+oMOn6tLlOP5+KQqy931gRaF+c5CU/V/7q7AzyuOj2blIUTf7aB9+xaDns= zuul@np0005553241.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:40:54 np0005553242.novalocal sudo[29638]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsmbfgdclwtvkcmmdfveptjlhzggztus ; /usr/bin/python3'
Dec 10 09:40:54 np0005553242.novalocal sudo[29638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:40:54 np0005553242.novalocal python3[29640]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ6t0ovugDjwmyWCbbwpPBp7t7Gxh+oMOn6tLlOP5+KQqy931gRaF+c5CU/V/7q7AzyuOj2blIUTf7aB9+xaDns= zuul@np0005553241.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:40:54 np0005553242.novalocal sudo[29638]: pam_unix(sudo:session): session closed for user root
Dec 10 09:40:55 np0005553242.novalocal sudo[29664]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avtdrtanpozelcemefijyoqdzyihjmwm ; /usr/bin/python3'
Dec 10 09:40:55 np0005553242.novalocal sudo[29664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:40:55 np0005553242.novalocal python3[29666]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005553242.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 10 09:40:55 np0005553242.novalocal useradd[29668]: new group: name=cloud-admin, GID=1002
Dec 10 09:40:55 np0005553242.novalocal useradd[29668]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Dec 10 09:40:55 np0005553242.novalocal sudo[29664]: pam_unix(sudo:session): session closed for user root
Dec 10 09:40:55 np0005553242.novalocal sudo[29698]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayruakmadgszmvldchseiuzkoutlbdns ; /usr/bin/python3'
Dec 10 09:40:55 np0005553242.novalocal sudo[29698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:40:55 np0005553242.novalocal python3[29700]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ6t0ovugDjwmyWCbbwpPBp7t7Gxh+oMOn6tLlOP5+KQqy931gRaF+c5CU/V/7q7AzyuOj2blIUTf7aB9+xaDns= zuul@np0005553241.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 10 09:40:55 np0005553242.novalocal sudo[29698]: pam_unix(sudo:session): session closed for user root
Dec 10 09:40:56 np0005553242.novalocal sudo[29776]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcvvhohymjazftoqvjjeipoxnccckupp ; /usr/bin/python3'
Dec 10 09:40:56 np0005553242.novalocal sudo[29776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:40:56 np0005553242.novalocal python3[29778]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:40:56 np0005553242.novalocal sudo[29776]: pam_unix(sudo:session): session closed for user root
Dec 10 09:40:56 np0005553242.novalocal sudo[29849]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwmjjzeauarcnkqscntneklnxtvufthm ; /usr/bin/python3'
Dec 10 09:40:56 np0005553242.novalocal sudo[29849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:40:56 np0005553242.novalocal python3[29851]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765359655.9611292-137-91183704832058/source _original_basename=tmpz7ceuguq follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:40:56 np0005553242.novalocal sudo[29849]: pam_unix(sudo:session): session closed for user root
Dec 10 09:40:57 np0005553242.novalocal sudo[29899]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwdtkzoriyknligebezazvoxddmxgvbr ; /usr/bin/python3'
Dec 10 09:40:57 np0005553242.novalocal sudo[29899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:40:57 np0005553242.novalocal python3[29901]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Dec 10 09:40:57 np0005553242.novalocal systemd[1]: Starting Hostname Service...
Dec 10 09:40:57 np0005553242.novalocal systemd[1]: Started Hostname Service.
Dec 10 09:40:57 np0005553242.novalocal systemd-hostnamed[29905]: Changed pretty hostname to 'compute-0'
Dec 10 09:40:57 compute-0 systemd-hostnamed[29905]: Hostname set to <compute-0> (static)
Dec 10 09:40:57 compute-0 NetworkManager[7187]: <info>  [1765359657.8715] hostname: static hostname changed from "np0005553242.novalocal" to "compute-0"
Dec 10 09:40:57 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 10 09:40:57 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 10 09:40:57 compute-0 sudo[29899]: pam_unix(sudo:session): session closed for user root
Dec 10 09:40:58 compute-0 sshd-session[29590]: Connection closed by 38.102.83.114 port 53516
Dec 10 09:40:58 compute-0 sshd-session[29587]: pam_unix(sshd:session): session closed for user zuul
Dec 10 09:40:58 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Dec 10 09:40:58 compute-0 systemd[1]: session-6.scope: Consumed 2.439s CPU time.
Dec 10 09:40:58 compute-0 systemd-logind[787]: Session 6 logged out. Waiting for processes to exit.
Dec 10 09:40:58 compute-0 systemd-logind[787]: Removed session 6.
Dec 10 09:41:07 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 10 09:41:18 compute-0 sshd-session[29918]: Invalid user solv from 193.32.162.146 port 59776
Dec 10 09:41:18 compute-0 sshd-session[29918]: Connection closed by invalid user solv 193.32.162.146 port 59776 [preauth]
Dec 10 09:41:27 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 10 09:43:20 compute-0 sshd-session[29925]: Invalid user solv from 193.32.162.146 port 47248
Dec 10 09:43:21 compute-0 sshd-session[29925]: Connection closed by invalid user solv 193.32.162.146 port 47248 [preauth]
Dec 10 09:44:41 compute-0 sshd-session[29928]: Accepted publickey for zuul from 38.102.83.136 port 59844 ssh2: RSA SHA256:B2Noj5c+ufeLikSCib8rdoBjF+7fxxkMjXUJkCp1GYw
Dec 10 09:44:41 compute-0 systemd-logind[787]: New session 7 of user zuul.
Dec 10 09:44:41 compute-0 systemd[1]: Started Session 7 of User zuul.
Dec 10 09:44:41 compute-0 sshd-session[29928]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 09:44:41 compute-0 python3[30004]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 09:44:44 compute-0 sudo[30118]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knmsdztzgxrudjxqrlqqdmynjsmvyyxs ; /usr/bin/python3'
Dec 10 09:44:44 compute-0 sudo[30118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:44 compute-0 python3[30120]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:44:44 compute-0 sudo[30118]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:44 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 10 09:44:44 compute-0 sudo[30191]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cphlvltmodxxwcdutrrgoahbiaifbnvz ; /usr/bin/python3'
Dec 10 09:44:44 compute-0 sudo[30191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:44 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 10 09:44:44 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 10 09:44:44 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 10 09:44:44 compute-0 python3[30194]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765359884.1984792-33640-240211279609592/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:44:45 compute-0 sudo[30191]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:45 compute-0 sudo[30219]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtacizysdpagjbkbvytnkqupbvoxbqge ; /usr/bin/python3'
Dec 10 09:44:45 compute-0 sudo[30219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:45 compute-0 python3[30221]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:44:45 compute-0 sudo[30219]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:45 compute-0 sudo[30292]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksmbqwromnpttnlkowrtyrzllrmwbftq ; /usr/bin/python3'
Dec 10 09:44:45 compute-0 sudo[30292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:45 compute-0 python3[30294]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765359884.1984792-33640-240211279609592/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:44:45 compute-0 sudo[30292]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:45 compute-0 sudo[30318]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sueylikrzwvyouhcmffitvghsqgteknn ; /usr/bin/python3'
Dec 10 09:44:45 compute-0 sudo[30318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:45 compute-0 python3[30320]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:44:45 compute-0 sudo[30318]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:46 compute-0 sudo[30391]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtlamsxjqsipgwjwtnxeluwowrbevqoz ; /usr/bin/python3'
Dec 10 09:44:46 compute-0 sudo[30391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:46 compute-0 python3[30393]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765359884.1984792-33640-240211279609592/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:44:46 compute-0 sudo[30391]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:46 compute-0 sudo[30417]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hydewhkkndbrturwdosweeugxnyfyjgw ; /usr/bin/python3'
Dec 10 09:44:46 compute-0 sudo[30417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:46 compute-0 python3[30419]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:44:46 compute-0 sudo[30417]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:46 compute-0 sudo[30490]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rflvbgpuenhaktvxfnaexxrgdqrupcwk ; /usr/bin/python3'
Dec 10 09:44:46 compute-0 sudo[30490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:46 compute-0 python3[30492]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765359884.1984792-33640-240211279609592/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:44:47 compute-0 sudo[30490]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:47 compute-0 sudo[30516]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpoxwxyotevsniyhhwcmgsetioluizrk ; /usr/bin/python3'
Dec 10 09:44:47 compute-0 sudo[30516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:47 compute-0 python3[30518]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:44:47 compute-0 sudo[30516]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:47 compute-0 sudo[30589]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsruxdblujtjmwrpnmjhvdyarcccjwbv ; /usr/bin/python3'
Dec 10 09:44:47 compute-0 sudo[30589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:47 compute-0 python3[30591]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765359884.1984792-33640-240211279609592/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:44:47 compute-0 sudo[30589]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:47 compute-0 sudo[30615]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzitiyxfdqhpnwbtxfvtbhxjrilfgfsh ; /usr/bin/python3'
Dec 10 09:44:47 compute-0 sudo[30615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:47 compute-0 python3[30617]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:44:47 compute-0 sudo[30615]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:48 compute-0 sudo[30688]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqdipntfafzpcseopbwjjyunhweczemw ; /usr/bin/python3'
Dec 10 09:44:48 compute-0 sudo[30688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:48 compute-0 python3[30690]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765359884.1984792-33640-240211279609592/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:44:48 compute-0 sudo[30688]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:48 compute-0 sudo[30714]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myphuicxupgmhprzuvxpwyueqwpgbtwv ; /usr/bin/python3'
Dec 10 09:44:48 compute-0 sudo[30714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:48 compute-0 python3[30716]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 10 09:44:48 compute-0 sudo[30714]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:48 compute-0 sudo[30787]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcvmlbbrroqssdlmnxoggijbgfrlwcej ; /usr/bin/python3'
Dec 10 09:44:48 compute-0 sudo[30787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:44:49 compute-0 python3[30789]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765359884.1984792-33640-240211279609592/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:44:49 compute-0 sudo[30787]: pam_unix(sudo:session): session closed for user root
Dec 10 09:44:51 compute-0 sshd-session[30814]: Unable to negotiate with 192.168.122.11 port 55488: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 10 09:44:51 compute-0 sshd-session[30815]: Connection closed by 192.168.122.11 port 55462 [preauth]
Dec 10 09:44:51 compute-0 sshd-session[30816]: Connection closed by 192.168.122.11 port 55470 [preauth]
Dec 10 09:44:51 compute-0 sshd-session[30818]: Unable to negotiate with 192.168.122.11 port 55476: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 10 09:44:51 compute-0 sshd-session[30817]: Unable to negotiate with 192.168.122.11 port 55498: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 10 09:45:00 compute-0 python3[30848]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:50:00 compute-0 sshd-session[29931]: Received disconnect from 38.102.83.136 port 59844:11: disconnected by user
Dec 10 09:50:00 compute-0 sshd-session[29931]: Disconnected from user zuul 38.102.83.136 port 59844
Dec 10 09:50:00 compute-0 sshd-session[29928]: pam_unix(sshd:session): session closed for user zuul
Dec 10 09:50:00 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Dec 10 09:50:00 compute-0 systemd[1]: session-7.scope: Consumed 5.479s CPU time.
Dec 10 09:50:00 compute-0 systemd-logind[787]: Session 7 logged out. Waiting for processes to exit.
Dec 10 09:50:00 compute-0 systemd-logind[787]: Removed session 7.
Dec 10 09:57:06 compute-0 sshd-session[30855]: Accepted publickey for zuul from 192.168.122.30 port 33532 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 09:57:06 compute-0 systemd-logind[787]: New session 8 of user zuul.
Dec 10 09:57:06 compute-0 systemd[1]: Started Session 8 of User zuul.
Dec 10 09:57:06 compute-0 sshd-session[30855]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 09:57:07 compute-0 python3.9[31008]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 09:57:08 compute-0 sudo[31187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpgfxneaxuououhkxcstzlrrgdtguite ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360628.4357054-32-233582177499267/AnsiballZ_command.py'
Dec 10 09:57:08 compute-0 sudo[31187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:57:09 compute-0 python3.9[31189]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:57:16 compute-0 sudo[31187]: pam_unix(sudo:session): session closed for user root
Dec 10 09:57:16 compute-0 sshd-session[30858]: Connection closed by 192.168.122.30 port 33532
Dec 10 09:57:16 compute-0 sshd-session[30855]: pam_unix(sshd:session): session closed for user zuul
Dec 10 09:57:16 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Dec 10 09:57:16 compute-0 systemd[1]: session-8.scope: Consumed 8.122s CPU time.
Dec 10 09:57:16 compute-0 systemd-logind[787]: Session 8 logged out. Waiting for processes to exit.
Dec 10 09:57:16 compute-0 systemd-logind[787]: Removed session 8.
Dec 10 09:57:21 compute-0 sshd-session[31247]: Accepted publickey for zuul from 192.168.122.30 port 53368 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 09:57:21 compute-0 systemd-logind[787]: New session 9 of user zuul.
Dec 10 09:57:21 compute-0 systemd[1]: Started Session 9 of User zuul.
Dec 10 09:57:21 compute-0 sshd-session[31247]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 09:57:23 compute-0 python3.9[31400]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 09:57:23 compute-0 sshd-session[31250]: Connection closed by 192.168.122.30 port 53368
Dec 10 09:57:23 compute-0 sshd-session[31247]: pam_unix(sshd:session): session closed for user zuul
Dec 10 09:57:23 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Dec 10 09:57:23 compute-0 systemd-logind[787]: Session 9 logged out. Waiting for processes to exit.
Dec 10 09:57:23 compute-0 systemd-logind[787]: Removed session 9.
Dec 10 09:57:39 compute-0 sshd-session[31429]: Accepted publickey for zuul from 192.168.122.30 port 56366 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 09:57:39 compute-0 systemd-logind[787]: New session 10 of user zuul.
Dec 10 09:57:39 compute-0 systemd[1]: Started Session 10 of User zuul.
Dec 10 09:57:39 compute-0 sshd-session[31429]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 09:57:40 compute-0 python3.9[31582]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 10 09:57:41 compute-0 python3.9[31756]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 09:57:41 compute-0 sudo[31906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdxxfemfcldbmkcpjwnjsihighrvinev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360661.456944-45-86078219522660/AnsiballZ_command.py'
Dec 10 09:57:41 compute-0 sudo[31906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:57:42 compute-0 python3.9[31908]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:57:42 compute-0 sudo[31906]: pam_unix(sudo:session): session closed for user root
Dec 10 09:57:42 compute-0 sudo[32059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihblhhstquazglkjclpblaeywjwkwaop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360662.3141673-57-137667623672220/AnsiballZ_stat.py'
Dec 10 09:57:42 compute-0 sudo[32059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:57:43 compute-0 python3.9[32061]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 09:57:43 compute-0 sudo[32059]: pam_unix(sudo:session): session closed for user root
Dec 10 09:57:43 compute-0 sudo[32211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajiaufzezhdowcedyddyzsndwbzkvmhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360663.180515-65-125306165414054/AnsiballZ_file.py'
Dec 10 09:57:43 compute-0 sudo[32211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:57:43 compute-0 python3.9[32213]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:57:43 compute-0 sudo[32211]: pam_unix(sudo:session): session closed for user root
Dec 10 09:57:44 compute-0 sudo[32363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeihdupgjsmqpowxvxlhizgknlnfyffv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360663.9654384-73-226593484141536/AnsiballZ_stat.py'
Dec 10 09:57:44 compute-0 sudo[32363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:57:44 compute-0 python3.9[32365]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 09:57:44 compute-0 sudo[32363]: pam_unix(sudo:session): session closed for user root
Dec 10 09:57:44 compute-0 sudo[32486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vufvomhirejcnhwrpkilyzgylkxxphup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360663.9654384-73-226593484141536/AnsiballZ_copy.py'
Dec 10 09:57:44 compute-0 sudo[32486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:57:45 compute-0 python3.9[32488]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765360663.9654384-73-226593484141536/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:57:45 compute-0 sudo[32486]: pam_unix(sudo:session): session closed for user root
Dec 10 09:57:45 compute-0 sudo[32638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xquzqfygufrfnenpmgqvxiiehpunderb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360665.3483613-88-37475757834476/AnsiballZ_setup.py'
Dec 10 09:57:45 compute-0 sudo[32638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:57:45 compute-0 python3.9[32640]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 09:57:46 compute-0 sudo[32638]: pam_unix(sudo:session): session closed for user root
Dec 10 09:57:46 compute-0 sudo[32794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jupmbjsjwbrxsgdffuummuflityfyjuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360666.295134-96-214424425058873/AnsiballZ_file.py'
Dec 10 09:57:46 compute-0 sudo[32794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:57:46 compute-0 python3.9[32796]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 09:57:46 compute-0 sudo[32794]: pam_unix(sudo:session): session closed for user root
Dec 10 09:57:47 compute-0 sudo[32946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rijyzswsnxmdqaxqaimywzhuzoaruory ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360667.052715-105-89957619882184/AnsiballZ_file.py'
Dec 10 09:57:47 compute-0 sudo[32946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:57:47 compute-0 python3.9[32948]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 09:57:47 compute-0 sudo[32946]: pam_unix(sudo:session): session closed for user root
Dec 10 09:57:48 compute-0 python3.9[33098]: ansible-ansible.builtin.service_facts Invoked
Dec 10 09:57:52 compute-0 python3.9[33351]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:57:52 compute-0 python3.9[33501]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 09:57:53 compute-0 python3.9[33655]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 09:57:54 compute-0 sudo[33811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eetvnmazjojzxweuwzfozbbvyllcnbma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360674.2878025-153-278267396230396/AnsiballZ_setup.py'
Dec 10 09:57:54 compute-0 sudo[33811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:57:54 compute-0 python3.9[33813]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 09:57:55 compute-0 sudo[33811]: pam_unix(sudo:session): session closed for user root
Dec 10 09:57:55 compute-0 sudo[33895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwtilbcqbgcdmgvsinwsniyobplgooqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360674.2878025-153-278267396230396/AnsiballZ_dnf.py'
Dec 10 09:57:55 compute-0 sudo[33895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:57:55 compute-0 python3.9[33897]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 09:58:38 compute-0 systemd[1]: Reloading.
Dec 10 09:58:38 compute-0 systemd-rc-local-generator[34093]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 09:58:38 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 10 09:58:39 compute-0 systemd[1]: Reloading.
Dec 10 09:58:39 compute-0 systemd-rc-local-generator[34133]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 09:58:39 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 10 09:58:39 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 10 09:58:39 compute-0 systemd[1]: Reloading.
Dec 10 09:58:39 compute-0 systemd-rc-local-generator[34174]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 09:58:39 compute-0 systemd[1]: Starting dnf makecache...
Dec 10 09:58:39 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 10 09:58:39 compute-0 dnf[34183]: Failed determining last makecache time.
Dec 10 09:58:39 compute-0 dnf[34183]: delorean-openstack-barbican-42b4c41831408a8e323 124 kB/s | 3.0 kB     00:00
Dec 10 09:58:39 compute-0 dbus-broker-launch[745]: Noticed file-system modification, trigger reload.
Dec 10 09:58:39 compute-0 dbus-broker-launch[745]: Noticed file-system modification, trigger reload.
Dec 10 09:58:39 compute-0 dnf[34183]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 157 kB/s | 3.0 kB     00:00
Dec 10 09:58:39 compute-0 dbus-broker-launch[745]: Noticed file-system modification, trigger reload.
Dec 10 09:58:39 compute-0 dnf[34183]: delorean-openstack-cinder-1c00d6490d88e436f26ef 151 kB/s | 3.0 kB     00:00
Dec 10 09:58:39 compute-0 dnf[34183]: delorean-python-stevedore-c4acc5639fd2329372142 157 kB/s | 3.0 kB     00:00
Dec 10 09:58:39 compute-0 dnf[34183]: delorean-python-cloudkitty-tests-tempest-2c80f8 145 kB/s | 3.0 kB     00:00
Dec 10 09:58:39 compute-0 dnf[34183]: delorean-os-refresh-config-9bfc52b5049be2d8de61 183 kB/s | 3.0 kB     00:00
Dec 10 09:58:39 compute-0 dnf[34183]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 161 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: delorean-python-designate-tests-tempest-347fdbc 167 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: delorean-openstack-glance-1fd12c29b339f30fe823e 152 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 147 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: delorean-openstack-manila-3c01b7181572c95dac462 106 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: delorean-python-whitebox-neutron-tests-tempest- 151 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: delorean-openstack-octavia-ba397f07a7331190208c 150 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: delorean-openstack-watcher-c014f81a8647287f6dcc 149 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: delorean-ansible-config_template-5ccaa22121a7ff 155 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 154 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: delorean-openstack-swift-dc98a8463506ac520c469a 163 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: delorean-python-tempestconf-8515371b7cceebd4282 146 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: delorean-openstack-heat-ui-013accbfd179753bc3f0 165 kB/s | 3.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: CentOS Stream 9 - BaseOS                         74 kB/s | 7.0 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: CentOS Stream 9 - AppStream                      62 kB/s | 7.4 kB     00:00
Dec 10 09:58:40 compute-0 dnf[34183]: CentOS Stream 9 - CRB                            29 kB/s | 6.9 kB     00:00
Dec 10 09:58:41 compute-0 dnf[34183]: CentOS Stream 9 - Extras packages                65 kB/s | 8.3 kB     00:00
Dec 10 09:58:41 compute-0 dnf[34183]: dlrn-antelope-testing                           142 kB/s | 3.0 kB     00:00
Dec 10 09:58:41 compute-0 dnf[34183]: dlrn-antelope-build-deps                        133 kB/s | 3.0 kB     00:00
Dec 10 09:58:41 compute-0 dnf[34183]: centos9-rabbitmq                                109 kB/s | 3.0 kB     00:00
Dec 10 09:58:41 compute-0 dnf[34183]: centos9-storage                                 132 kB/s | 3.0 kB     00:00
Dec 10 09:58:41 compute-0 dnf[34183]: centos9-opstools                                140 kB/s | 3.0 kB     00:00
Dec 10 09:58:41 compute-0 dnf[34183]: NFV SIG OpenvSwitch                             150 kB/s | 3.0 kB     00:00
Dec 10 09:58:41 compute-0 dnf[34183]: repo-setup-centos-appstream                     211 kB/s | 4.4 kB     00:00
Dec 10 09:58:41 compute-0 dnf[34183]: repo-setup-centos-baseos                        201 kB/s | 3.9 kB     00:00
Dec 10 09:58:41 compute-0 dnf[34183]: repo-setup-centos-highavailability              188 kB/s | 3.9 kB     00:00
Dec 10 09:58:41 compute-0 dnf[34183]: repo-setup-centos-powertools                    202 kB/s | 4.3 kB     00:00
Dec 10 09:58:41 compute-0 dnf[34183]: Extra Packages for Enterprise Linux 9 - x86_64  242 kB/s |  33 kB     00:00
Dec 10 09:58:42 compute-0 dnf[34183]: Metadata cache created.
Dec 10 09:58:42 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 10 09:58:42 compute-0 systemd[1]: Finished dnf makecache.
Dec 10 09:58:42 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.837s CPU time.
Dec 10 09:59:42 compute-0 kernel: SELinux:  Converting 2719 SID table entries...
Dec 10 09:59:42 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 10 09:59:42 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 10 09:59:42 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 10 09:59:42 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 10 09:59:42 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 10 09:59:42 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 10 09:59:42 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 10 09:59:42 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 10 09:59:42 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 10 09:59:42 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 10 09:59:42 compute-0 systemd[1]: Reloading.
Dec 10 09:59:42 compute-0 systemd-rc-local-generator[34548]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 09:59:43 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 10 09:59:43 compute-0 sudo[33895]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:43 compute-0 sudo[35456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kflbscnyvgdbmxajazaexvzyzphkmcji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360783.6238422-165-237249111243109/AnsiballZ_command.py'
Dec 10 09:59:43 compute-0 sudo[35456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:59:43 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 10 09:59:43 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 10 09:59:43 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.250s CPU time.
Dec 10 09:59:43 compute-0 systemd[1]: run-rf6f68fcdcb444949a7aa023e5a823797.service: Deactivated successfully.
Dec 10 09:59:44 compute-0 python3.9[35458]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:59:44 compute-0 sudo[35456]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:45 compute-0 sudo[35738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxunbmlkvwkgcuojrgcylncokdpiglxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360785.1394224-173-83129862095588/AnsiballZ_selinux.py'
Dec 10 09:59:45 compute-0 sudo[35738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:59:46 compute-0 python3.9[35740]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 10 09:59:46 compute-0 sudo[35738]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:46 compute-0 sudo[35890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bisiqidbtjxiopukvfatbybvlxbpdqjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360786.4115334-184-253301639254627/AnsiballZ_command.py'
Dec 10 09:59:46 compute-0 sudo[35890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:59:46 compute-0 python3.9[35892]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 10 09:59:47 compute-0 sudo[35890]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:48 compute-0 sudo[36044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhluheutjcvshiykatohhrccgfftpagi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360788.204389-192-56299359106820/AnsiballZ_file.py'
Dec 10 09:59:48 compute-0 sudo[36044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:59:49 compute-0 python3.9[36046]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:59:49 compute-0 sudo[36044]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:50 compute-0 sudo[36196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rimlhkulcvrsrumrctyhzhjxflsoriaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360789.4498663-200-208424631106739/AnsiballZ_mount.py'
Dec 10 09:59:50 compute-0 sudo[36196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:59:50 compute-0 python3.9[36198]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 10 09:59:50 compute-0 sudo[36196]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:51 compute-0 sudo[36348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szrdqgwnqoduijzhrvwwdxuefsncnukb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360790.9712307-228-2826208697465/AnsiballZ_file.py'
Dec 10 09:59:51 compute-0 sudo[36348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:59:51 compute-0 python3.9[36350]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 09:59:51 compute-0 sudo[36348]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:52 compute-0 sudo[36500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msyoyceggmspvjlerhtawlwqcmiiratr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360791.71577-236-51987140653413/AnsiballZ_stat.py'
Dec 10 09:59:52 compute-0 sudo[36500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:59:52 compute-0 python3.9[36502]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 09:59:52 compute-0 sudo[36500]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:52 compute-0 sudo[36623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsfyfttukcfpmslpynqoycgqemfaqhdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360791.71577-236-51987140653413/AnsiballZ_copy.py'
Dec 10 09:59:52 compute-0 sudo[36623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:59:52 compute-0 python3.9[36625]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765360791.71577-236-51987140653413/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21a1e9e6cd9583f67d50b5fc30bf05f5f214a4e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:59:52 compute-0 sudo[36623]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:53 compute-0 sudo[36775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yozqbvgeakfbipiwpctgrdkhdhjhxlco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360793.4382086-260-196032764260705/AnsiballZ_stat.py'
Dec 10 09:59:53 compute-0 sudo[36775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:59:56 compute-0 python3.9[36777]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 09:59:56 compute-0 sudo[36775]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:56 compute-0 sudo[36927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwjqaoabrohfcewrcqozxobiwcvbieeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360796.4594321-268-127133267186245/AnsiballZ_command.py'
Dec 10 09:59:56 compute-0 sudo[36927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:59:56 compute-0 python3.9[36929]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 09:59:56 compute-0 sudo[36927]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:57 compute-0 sudo[37080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkdhfrunvhjklteezrfdqqbrdjiqrrcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360797.1777506-276-259135127569276/AnsiballZ_file.py'
Dec 10 09:59:57 compute-0 sudo[37080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:59:58 compute-0 python3.9[37082]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 09:59:58 compute-0 sudo[37080]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:59 compute-0 sudo[37232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsogrzijcanfvyqtfwkzhochnpvrlebu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360798.9872398-287-249862930623863/AnsiballZ_getent.py'
Dec 10 09:59:59 compute-0 sudo[37232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 09:59:59 compute-0 python3.9[37234]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 10 09:59:59 compute-0 sudo[37232]: pam_unix(sudo:session): session closed for user root
Dec 10 09:59:59 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 10 10:00:00 compute-0 sudo[37386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rylqtdwtxbrbcybmmjzbxzvfreeivioi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360799.8373709-295-131911220817084/AnsiballZ_group.py'
Dec 10 10:00:00 compute-0 sudo[37386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:00 compute-0 python3.9[37388]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 10 10:00:00 compute-0 groupadd[37389]: group added to /etc/group: name=qemu, GID=107
Dec 10 10:00:00 compute-0 groupadd[37389]: group added to /etc/gshadow: name=qemu
Dec 10 10:00:00 compute-0 groupadd[37389]: new group: name=qemu, GID=107
Dec 10 10:00:00 compute-0 sudo[37386]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:01 compute-0 sudo[37544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuajwfukxtenlnbccebcqxvafnkmukai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360800.7661574-303-87985500477865/AnsiballZ_user.py'
Dec 10 10:00:01 compute-0 sudo[37544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:01 compute-0 python3.9[37546]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 10 10:00:01 compute-0 useradd[37548]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Dec 10 10:00:01 compute-0 sudo[37544]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:02 compute-0 sudo[37704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihoqynkqpyanlbjnspmrqebrmgchsezc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360801.937905-311-145444588008987/AnsiballZ_getent.py'
Dec 10 10:00:02 compute-0 sudo[37704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:02 compute-0 python3.9[37706]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 10 10:00:02 compute-0 sudo[37704]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:02 compute-0 sudo[37857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gffaylgfaeogvkszutxokiyvibroyofc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360802.6372526-319-244495548456347/AnsiballZ_group.py'
Dec 10 10:00:02 compute-0 sudo[37857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:03 compute-0 python3.9[37859]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 10 10:00:03 compute-0 groupadd[37860]: group added to /etc/group: name=hugetlbfs, GID=42477
Dec 10 10:00:03 compute-0 groupadd[37860]: group added to /etc/gshadow: name=hugetlbfs
Dec 10 10:00:03 compute-0 groupadd[37860]: new group: name=hugetlbfs, GID=42477
Dec 10 10:00:03 compute-0 sudo[37857]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:03 compute-0 sudo[38015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxwqntisgwuqhojjffryiwcufgvyfsok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360803.5455024-328-82680632456468/AnsiballZ_file.py'
Dec 10 10:00:03 compute-0 sudo[38015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:04 compute-0 python3.9[38017]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 10 10:00:04 compute-0 sudo[38015]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:06 compute-0 sudo[38168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdnuxyhpjqmbqgoncdngukuwsuqbcefl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360806.1338172-339-147074386365190/AnsiballZ_dnf.py'
Dec 10 10:00:06 compute-0 sudo[38168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:06 compute-0 python3.9[38170]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:00:08 compute-0 sudo[38168]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:08 compute-0 sudo[38321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abubqyipuifdlntpjsolrmaxkyrsklvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360808.388141-347-260991437918568/AnsiballZ_file.py'
Dec 10 10:00:08 compute-0 sudo[38321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:08 compute-0 python3.9[38323]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:00:08 compute-0 sudo[38321]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:09 compute-0 sudo[38473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plbsmzxciakqypdevoofukgpokfizkxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360809.073895-355-124883634923946/AnsiballZ_stat.py'
Dec 10 10:00:09 compute-0 sudo[38473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:09 compute-0 python3.9[38475]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:00:09 compute-0 sudo[38473]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:09 compute-0 sudo[38596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndxglkkyafiejndepcuumtqzjawdnbse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360809.073895-355-124883634923946/AnsiballZ_copy.py'
Dec 10 10:00:09 compute-0 sudo[38596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:10 compute-0 python3.9[38598]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765360809.073895-355-124883634923946/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:00:10 compute-0 sudo[38596]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:11 compute-0 sudo[38748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsdczezxjmafqlzaqmpouzbezvjzrlfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360810.3580549-370-196619692343318/AnsiballZ_systemd.py'
Dec 10 10:00:11 compute-0 sudo[38748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:11 compute-0 python3.9[38750]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:00:11 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 10 10:00:11 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 10 10:00:11 compute-0 kernel: Bridge firewalling registered
Dec 10 10:00:11 compute-0 systemd-modules-load[38754]: Inserted module 'br_netfilter'
Dec 10 10:00:11 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 10 10:00:11 compute-0 sudo[38748]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:11 compute-0 sudo[38907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypkkwoxwviahdsveirurfjrqzhqpnfyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360811.6751144-378-104897650908127/AnsiballZ_stat.py'
Dec 10 10:00:11 compute-0 sudo[38907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:12 compute-0 python3.9[38909]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:00:12 compute-0 sudo[38907]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:12 compute-0 sudo[39030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwfvzruplejxgrbpowfbhmviovtmoatn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360811.6751144-378-104897650908127/AnsiballZ_copy.py'
Dec 10 10:00:12 compute-0 sudo[39030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:12 compute-0 python3.9[39032]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765360811.6751144-378-104897650908127/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:00:12 compute-0 sudo[39030]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:13 compute-0 sudo[39182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gupovoufqafeizcrqinjpjtwmewasslt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360812.9927626-396-44016863809810/AnsiballZ_dnf.py'
Dec 10 10:00:13 compute-0 sudo[39182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:13 compute-0 python3.9[39184]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:00:16 compute-0 dbus-broker-launch[745]: Noticed file-system modification, trigger reload.
Dec 10 10:00:16 compute-0 dbus-broker-launch[745]: Noticed file-system modification, trigger reload.
Dec 10 10:00:16 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 10 10:00:16 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 10 10:00:16 compute-0 systemd[1]: Reloading.
Dec 10 10:00:16 compute-0 systemd-rc-local-generator[39250]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:00:16 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 10 10:00:17 compute-0 sudo[39182]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:18 compute-0 python3.9[40381]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:00:18 compute-0 python3.9[41309]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 10 10:00:19 compute-0 python3.9[42152]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:00:19 compute-0 sudo[43046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlztkxzttvzmnejuhpjyfflwfimbtymy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360819.699786-435-124816820405626/AnsiballZ_command.py'
Dec 10 10:00:19 compute-0 sudo[43046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:20 compute-0 python3.9[43054]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:00:20 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 10 10:00:20 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 10 10:00:20 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 10 10:00:20 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.825s CPU time.
Dec 10 10:00:20 compute-0 systemd[1]: run-ra11a30e233a2477790f26f9431a3ff87.service: Deactivated successfully.
Dec 10 10:00:20 compute-0 systemd[1]: Starting Authorization Manager...
Dec 10 10:00:20 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 10 10:00:20 compute-0 polkitd[43606]: Started polkitd version 0.117
Dec 10 10:00:20 compute-0 polkitd[43606]: Loading rules from directory /etc/polkit-1/rules.d
Dec 10 10:00:20 compute-0 polkitd[43606]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 10 10:00:20 compute-0 polkitd[43606]: Finished loading, compiling and executing 2 rules
Dec 10 10:00:20 compute-0 systemd[1]: Started Authorization Manager.
Dec 10 10:00:20 compute-0 polkitd[43606]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 10 10:00:20 compute-0 sudo[43046]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:21 compute-0 sudo[43774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvgbsqjourigwejnjoiiloxroogedqco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360821.123498-444-116505115898952/AnsiballZ_systemd.py'
Dec 10 10:00:21 compute-0 sudo[43774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:21 compute-0 python3.9[43776]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:00:21 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 10 10:00:21 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Dec 10 10:00:21 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 10 10:00:21 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 10 10:00:22 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 10 10:00:22 compute-0 sudo[43774]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:22 compute-0 python3.9[43937]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 10 10:00:24 compute-0 sudo[44087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiaigobrmocnsaqhsiryjefnkvifegtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360824.1920974-501-60818683606784/AnsiballZ_systemd.py'
Dec 10 10:00:24 compute-0 sudo[44087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:24 compute-0 python3.9[44089]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:00:24 compute-0 systemd[1]: Reloading.
Dec 10 10:00:24 compute-0 systemd-rc-local-generator[44121]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:00:25 compute-0 sudo[44087]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:25 compute-0 sudo[44277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqerbyqdqpphprohiqcmktgmdyaiwbth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360825.1672075-501-253620180273068/AnsiballZ_systemd.py'
Dec 10 10:00:25 compute-0 sudo[44277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:25 compute-0 python3.9[44279]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:00:25 compute-0 systemd[1]: Reloading.
Dec 10 10:00:25 compute-0 systemd-rc-local-generator[44311]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:00:26 compute-0 sudo[44277]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:26 compute-0 sudo[44467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uprvgiyqqwocxfswerjgigzpdfxcapte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360826.2725735-517-205574942052885/AnsiballZ_command.py'
Dec 10 10:00:26 compute-0 sudo[44467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:26 compute-0 python3.9[44469]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:00:26 compute-0 sudo[44467]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:27 compute-0 sudo[44620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyxftbrxfcnscglikxmyolpjsyxwzcpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360826.9775069-525-15427303261022/AnsiballZ_command.py'
Dec 10 10:00:27 compute-0 sudo[44620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:27 compute-0 python3.9[44622]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:00:27 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 10 10:00:27 compute-0 sudo[44620]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:27 compute-0 sudo[44773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugcsblegpwpbeikmoaykzuahwqlhansd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360827.6849-533-233357550992887/AnsiballZ_command.py'
Dec 10 10:00:27 compute-0 sudo[44773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:28 compute-0 python3.9[44775]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:00:29 compute-0 sudo[44773]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:29 compute-0 sudo[44935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiehnnqdgatwqsoqlhpnkbklhzcynsye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360829.7142339-541-134014115278499/AnsiballZ_command.py'
Dec 10 10:00:29 compute-0 sudo[44935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:30 compute-0 python3.9[44937]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:00:30 compute-0 sudo[44935]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:30 compute-0 sudo[45088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khteuutwdlmcosgygqgctvumcyjrhgpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360830.3128612-549-83170937492007/AnsiballZ_systemd.py'
Dec 10 10:00:30 compute-0 sudo[45088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:30 compute-0 python3.9[45090]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:00:30 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 10 10:00:30 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Dec 10 10:00:30 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Dec 10 10:00:30 compute-0 systemd[1]: Starting Apply Kernel Variables...
Dec 10 10:00:30 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 10 10:00:30 compute-0 systemd[1]: Finished Apply Kernel Variables.
Dec 10 10:00:31 compute-0 sudo[45088]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:31 compute-0 sshd-session[31432]: Connection closed by 192.168.122.30 port 56366
Dec 10 10:00:31 compute-0 sshd-session[31429]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:00:31 compute-0 systemd-logind[787]: Session 10 logged out. Waiting for processes to exit.
Dec 10 10:00:31 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Dec 10 10:00:31 compute-0 systemd[1]: session-10.scope: Consumed 2min 15.281s CPU time.
Dec 10 10:00:31 compute-0 systemd-logind[787]: Removed session 10.
Dec 10 10:00:36 compute-0 sshd-session[45120]: Accepted publickey for zuul from 192.168.122.30 port 38222 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:00:36 compute-0 systemd-logind[787]: New session 11 of user zuul.
Dec 10 10:00:36 compute-0 systemd[1]: Started Session 11 of User zuul.
Dec 10 10:00:36 compute-0 sshd-session[45120]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:00:37 compute-0 python3.9[45273]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:00:38 compute-0 python3.9[45427]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:00:39 compute-0 sudo[45581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oumwvytwlzzgkufvmghsfnqhhprfrsst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360839.041234-50-275956331819635/AnsiballZ_command.py'
Dec 10 10:00:39 compute-0 sudo[45581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:39 compute-0 python3.9[45583]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:00:39 compute-0 sudo[45581]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:40 compute-0 python3.9[45734]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:00:41 compute-0 sudo[45888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jygrwrzobuhhpgjsalruugdnpariwlul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360841.032022-70-172850909170797/AnsiballZ_setup.py'
Dec 10 10:00:41 compute-0 sudo[45888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:41 compute-0 python3.9[45890]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:00:41 compute-0 sudo[45888]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:42 compute-0 sudo[45972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pawfkcnvhynavtnmdgpugcxzryqqcctc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360841.032022-70-172850909170797/AnsiballZ_dnf.py'
Dec 10 10:00:42 compute-0 sudo[45972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:42 compute-0 python3.9[45974]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:00:43 compute-0 sudo[45972]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:44 compute-0 sudo[46125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgojohalicvcmceqjvfpgxlovzuqfexa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360844.1234457-82-257200169295668/AnsiballZ_setup.py'
Dec 10 10:00:44 compute-0 sudo[46125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:44 compute-0 python3.9[46127]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:00:44 compute-0 sudo[46125]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:45 compute-0 sudo[46296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejhnoezaunlckzpuymmbxzyeiavqwzrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360845.1394553-93-5072642125356/AnsiballZ_file.py'
Dec 10 10:00:45 compute-0 sudo[46296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:45 compute-0 python3.9[46298]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:00:45 compute-0 sudo[46296]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:46 compute-0 sudo[46448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejmxkoacerfjbrhekarxykeuvqcnyxjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360846.0098362-101-87623895684934/AnsiballZ_command.py'
Dec 10 10:00:46 compute-0 sudo[46448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:46 compute-0 python3.9[46450]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:00:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2198126292-merged.mount: Deactivated successfully.
Dec 10 10:00:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck484846220-merged.mount: Deactivated successfully.
Dec 10 10:00:46 compute-0 podman[46451]: 2025-12-10 10:00:46.488231856 +0000 UTC m=+0.050217627 system refresh
Dec 10 10:00:46 compute-0 sudo[46448]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:47 compute-0 sudo[46611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukhluwrcwvtzuuginnydizugabtmpmyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360846.6789324-109-240093672260169/AnsiballZ_stat.py'
Dec 10 10:00:47 compute-0 sudo[46611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:47 compute-0 python3.9[46613]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:00:47 compute-0 sudo[46611]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:00:47 compute-0 sudo[46734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gevzpdhibpixrkvdphucguncygsrufww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360846.6789324-109-240093672260169/AnsiballZ_copy.py'
Dec 10 10:00:47 compute-0 sudo[46734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:47 compute-0 python3.9[46736]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765360846.6789324-109-240093672260169/.source.json follow=False _original_basename=podman_network_config.j2 checksum=61074e216ded1f0845916d11ee06ab1d2659aeee backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:00:48 compute-0 sudo[46734]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:48 compute-0 sudo[46886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwzyrckmzictfxaxeihbutkekgeyjdfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360848.1477752-124-236662383053550/AnsiballZ_stat.py'
Dec 10 10:00:48 compute-0 sudo[46886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:48 compute-0 python3.9[46888]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:00:48 compute-0 sudo[46886]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:48 compute-0 sudo[47009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezxtfnpgbnozotjfvptcxpaoqrbaddnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360848.1477752-124-236662383053550/AnsiballZ_copy.py'
Dec 10 10:00:48 compute-0 sudo[47009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:49 compute-0 python3.9[47011]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765360848.1477752-124-236662383053550/.source.conf follow=False _original_basename=registries.conf.j2 checksum=47ab0eaad1aa9516c1fbb9a53e93b649abbc1993 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:00:49 compute-0 sudo[47009]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:49 compute-0 sudo[47161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nurzyqgayxbcvycloczoxtbigafgcbdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360849.3985913-140-201205649802781/AnsiballZ_ini_file.py'
Dec 10 10:00:49 compute-0 sudo[47161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:50 compute-0 python3.9[47163]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:00:50 compute-0 sudo[47161]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:50 compute-0 sudo[47313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghpllibkzjfwrqtifpdxcplpjzfwfqki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360850.1994703-140-83544529113551/AnsiballZ_ini_file.py'
Dec 10 10:00:50 compute-0 sudo[47313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:50 compute-0 python3.9[47315]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:00:50 compute-0 sudo[47313]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:51 compute-0 sudo[47465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibokzqiamsejfffugnpiuvzjjoylqdrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360850.7983491-140-148162744976095/AnsiballZ_ini_file.py'
Dec 10 10:00:51 compute-0 sudo[47465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:51 compute-0 python3.9[47467]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:00:51 compute-0 sudo[47465]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:51 compute-0 sudo[47617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugorouvsveetsjbtbwugbtlvocacxudm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360851.4567115-140-241409808440797/AnsiballZ_ini_file.py'
Dec 10 10:00:51 compute-0 sudo[47617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:51 compute-0 python3.9[47619]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:00:51 compute-0 sudo[47617]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:52 compute-0 python3.9[47769]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:00:53 compute-0 sudo[47921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oklfyqzphrfftmzuvavddljdtoqbaryl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360852.958103-180-31075498867795/AnsiballZ_dnf.py'
Dec 10 10:00:53 compute-0 sudo[47921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:53 compute-0 python3.9[47923]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 10 10:00:54 compute-0 sudo[47921]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:55 compute-0 sudo[48074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nllmyxphbhywemziztdxdlrttwznfjkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360854.8936176-188-226945383743497/AnsiballZ_dnf.py'
Dec 10 10:00:55 compute-0 sudo[48074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:55 compute-0 python3.9[48076]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 10 10:00:57 compute-0 sudo[48074]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:57 compute-0 sudo[48234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvcfkqheoyekpnzvvzomzydkyqtnkpyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360857.4767454-198-234012376147478/AnsiballZ_dnf.py'
Dec 10 10:00:57 compute-0 sudo[48234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:00:58 compute-0 python3.9[48236]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 10 10:00:59 compute-0 sudo[48234]: pam_unix(sudo:session): session closed for user root
Dec 10 10:00:59 compute-0 sudo[48387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnmfncdqxdctonkclppogujftuiiwurb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360859.5472925-207-94002463796117/AnsiballZ_dnf.py'
Dec 10 10:00:59 compute-0 sudo[48387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:00 compute-0 python3.9[48389]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 10 10:01:01 compute-0 sudo[48387]: pam_unix(sudo:session): session closed for user root
Dec 10 10:01:01 compute-0 CROND[48439]: (root) CMD (run-parts /etc/cron.hourly)
Dec 10 10:01:01 compute-0 run-parts[48442]: (/etc/cron.hourly) starting 0anacron
Dec 10 10:01:01 compute-0 anacron[48455]: Anacron started on 2025-12-10
Dec 10 10:01:01 compute-0 anacron[48455]: Will run job `cron.daily' in 34 min.
Dec 10 10:01:01 compute-0 anacron[48455]: Will run job `cron.weekly' in 54 min.
Dec 10 10:01:01 compute-0 anacron[48455]: Will run job `cron.monthly' in 74 min.
Dec 10 10:01:01 compute-0 anacron[48455]: Jobs will be executed sequentially
Dec 10 10:01:01 compute-0 run-parts[48459]: (/etc/cron.hourly) finished 0anacron
Dec 10 10:01:01 compute-0 CROND[48434]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 10 10:01:01 compute-0 sudo[48555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cabtldextolnzitsjyrtdsunxneowtnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360861.6079943-218-133554387338863/AnsiballZ_dnf.py'
Dec 10 10:01:01 compute-0 sudo[48555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:02 compute-0 python3.9[48557]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 10 10:01:04 compute-0 sudo[48555]: pam_unix(sudo:session): session closed for user root
Dec 10 10:01:04 compute-0 sudo[48711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jylagsvfknfmheqydlkokqrruecyybff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360864.223378-226-272514314475842/AnsiballZ_dnf.py'
Dec 10 10:01:04 compute-0 sudo[48711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:04 compute-0 python3.9[48713]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 10 10:01:07 compute-0 sudo[48711]: pam_unix(sudo:session): session closed for user root
Dec 10 10:01:07 compute-0 sudo[48881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoqzdlavbhpoiipwauxeihfcfcraiqmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360867.3835123-235-205346737830610/AnsiballZ_dnf.py'
Dec 10 10:01:07 compute-0 sudo[48881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:07 compute-0 python3.9[48883]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 10 10:01:09 compute-0 sudo[48881]: pam_unix(sudo:session): session closed for user root
Dec 10 10:01:09 compute-0 sudo[49034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocvcjpzivhertgevaavypvcgqkpdjsdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360869.418173-244-220485845058543/AnsiballZ_dnf.py'
Dec 10 10:01:09 compute-0 sudo[49034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:09 compute-0 python3.9[49036]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 10 10:01:22 compute-0 sudo[49034]: pam_unix(sudo:session): session closed for user root
Dec 10 10:01:23 compute-0 sudo[49371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omwyzdhweudhxcndopezdwcoiwpxqlup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360883.1537166-253-20434281644446/AnsiballZ_dnf.py'
Dec 10 10:01:23 compute-0 sudo[49371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:23 compute-0 python3.9[49373]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 10 10:01:24 compute-0 sudo[49371]: pam_unix(sudo:session): session closed for user root
Dec 10 10:01:25 compute-0 sudo[49527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tixbqqutsurbeexsunjedmhxqizltkjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360885.3458745-264-191208756896389/AnsiballZ_file.py'
Dec 10 10:01:25 compute-0 sudo[49527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:25 compute-0 python3.9[49529]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:01:25 compute-0 sudo[49527]: pam_unix(sudo:session): session closed for user root
Dec 10 10:01:26 compute-0 sudo[49702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgaiqzlsqrjldgtrqvhhvbharwmrbluz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360886.004639-272-76198335781670/AnsiballZ_stat.py'
Dec 10 10:01:26 compute-0 sudo[49702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:26 compute-0 python3.9[49704]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:01:26 compute-0 sudo[49702]: pam_unix(sudo:session): session closed for user root
Dec 10 10:01:26 compute-0 sudo[49825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsfgjdchybjkfixpiifjjqtwusdnyqgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360886.004639-272-76198335781670/AnsiballZ_copy.py'
Dec 10 10:01:26 compute-0 sudo[49825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:27 compute-0 python3.9[49827]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1765360886.004639-272-76198335781670/.source.json _original_basename=.gyb63ziu follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:01:27 compute-0 sudo[49825]: pam_unix(sudo:session): session closed for user root
Dec 10 10:01:27 compute-0 sudo[49977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwaupoowgkkczdhrlnyekyhcpwmxyvgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360887.42069-290-48720446651138/AnsiballZ_podman_image.py'
Dec 10 10:01:27 compute-0 sudo[49977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:28 compute-0 python3.9[49979]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 10 10:01:28 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:01:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat144334203-lower\x2dmapped.mount: Deactivated successfully.
Dec 10 10:01:33 compute-0 podman[49991]: 2025-12-10 10:01:33.827493893 +0000 UTC m=+5.575009237 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 10 10:01:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:01:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:01:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:01:34 compute-0 sudo[49977]: pam_unix(sudo:session): session closed for user root
Dec 10 10:01:34 compute-0 sudo[50285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fckggkjgqsmifzvefhbookovbqgnodth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360894.4394503-301-259778368395841/AnsiballZ_podman_image.py'
Dec 10 10:01:34 compute-0 sudo[50285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:34 compute-0 python3.9[50287]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 10 10:01:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:01:44 compute-0 podman[50300]: 2025-12-10 10:01:44.834656773 +0000 UTC m=+9.807237797 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:01:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:01:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:01:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:01:45 compute-0 sudo[50285]: pam_unix(sudo:session): session closed for user root
Dec 10 10:01:45 compute-0 sudo[50595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfoidzdmkjrdxedbyemmlkylkvzzyqax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360905.3957794-311-229094735389834/AnsiballZ_podman_image.py'
Dec 10 10:01:45 compute-0 sudo[50595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:45 compute-0 python3.9[50597]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 10 10:01:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:01:47 compute-0 podman[50610]: 2025-12-10 10:01:47.184979333 +0000 UTC m=+1.241223921 image pull bcd3898ac099c7fff3d2ff3fc32de931119ed36068f8a2617bd8fa95e51d1b81 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 10 10:01:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:01:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:01:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:01:47 compute-0 sudo[50595]: pam_unix(sudo:session): session closed for user root
Dec 10 10:01:48 compute-0 sudo[50843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrfllvbyqbbqatloofjrvjobfrfqxwgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360907.7266948-320-33837343248519/AnsiballZ_podman_image.py'
Dec 10 10:01:48 compute-0 sudo[50843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:01:48 compute-0 python3.9[50845]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 10 10:01:48 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:02:00 compute-0 podman[50856]: 2025-12-10 10:02:00.340206977 +0000 UTC m=+12.038738956 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 10 10:02:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:02:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:02:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:02:00 compute-0 sudo[50843]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:01 compute-0 sudo[51111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqappjqgfbidvgrlreebxcpfvgfkvnby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360920.9437163-331-175082165016481/AnsiballZ_podman_image.py'
Dec 10 10:02:01 compute-0 sudo[51111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:01 compute-0 python3.9[51113]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 10 10:02:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:02:05 compute-0 podman[51125]: 2025-12-10 10:02:05.46142553 +0000 UTC m=+3.965914596 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 10 10:02:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:02:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:02:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:02:05 compute-0 sudo[51111]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:06 compute-0 sudo[51386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyttjjoefaqodebwqlwjdclnrtcjfode ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360925.9103746-331-169471120430816/AnsiballZ_podman_image.py'
Dec 10 10:02:06 compute-0 sudo[51386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:06 compute-0 python3.9[51388]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 10 10:02:06 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:02:10 compute-0 podman[51401]: 2025-12-10 10:02:10.222326827 +0000 UTC m=+3.685300124 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec 10 10:02:10 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:02:10 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:02:10 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:02:10 compute-0 sudo[51386]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:11 compute-0 sshd-session[45123]: Connection closed by 192.168.122.30 port 38222
Dec 10 10:02:11 compute-0 sshd-session[45120]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:02:11 compute-0 systemd-logind[787]: Session 11 logged out. Waiting for processes to exit.
Dec 10 10:02:11 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Dec 10 10:02:11 compute-0 systemd[1]: session-11.scope: Consumed 1min 48.114s CPU time.
Dec 10 10:02:11 compute-0 systemd-logind[787]: Removed session 11.
Dec 10 10:02:16 compute-0 sshd-session[51542]: Accepted publickey for zuul from 192.168.122.30 port 48682 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:02:16 compute-0 systemd-logind[787]: New session 12 of user zuul.
Dec 10 10:02:16 compute-0 systemd[1]: Started Session 12 of User zuul.
Dec 10 10:02:16 compute-0 sshd-session[51542]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:02:17 compute-0 python3.9[51695]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:02:18 compute-0 sudo[51849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agbcjgvzyvfysrxzxcsyeuummvwmkfnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360938.4311142-36-46232261538157/AnsiballZ_getent.py'
Dec 10 10:02:18 compute-0 sudo[51849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:19 compute-0 python3.9[51851]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 10 10:02:19 compute-0 sudo[51849]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:19 compute-0 sudo[52002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qldywlombvpkohmonmjjatcnyjifbhhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360939.234796-44-35066789632733/AnsiballZ_group.py'
Dec 10 10:02:19 compute-0 sudo[52002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:19 compute-0 python3.9[52004]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 10 10:02:19 compute-0 groupadd[52005]: group added to /etc/group: name=openvswitch, GID=42476
Dec 10 10:02:19 compute-0 groupadd[52005]: group added to /etc/gshadow: name=openvswitch
Dec 10 10:02:19 compute-0 groupadd[52005]: new group: name=openvswitch, GID=42476
Dec 10 10:02:19 compute-0 sudo[52002]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:20 compute-0 sudo[52160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izsgypqiuwuiithkrjadiyqadggfjrpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360940.0894873-52-219409529125723/AnsiballZ_user.py'
Dec 10 10:02:20 compute-0 sudo[52160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:20 compute-0 python3.9[52162]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 10 10:02:20 compute-0 useradd[52164]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Dec 10 10:02:20 compute-0 useradd[52164]: add 'openvswitch' to group 'hugetlbfs'
Dec 10 10:02:20 compute-0 useradd[52164]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 10 10:02:20 compute-0 sudo[52160]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:21 compute-0 sudo[52320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjktocyeqxfgqdieonihnlgbzrpikyiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360941.1982894-62-39640807665408/AnsiballZ_setup.py'
Dec 10 10:02:21 compute-0 sudo[52320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:21 compute-0 python3.9[52322]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:02:22 compute-0 sudo[52320]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:22 compute-0 sudo[52404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcidoltlhtpxcbftbilbovizgujgssvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360941.1982894-62-39640807665408/AnsiballZ_dnf.py'
Dec 10 10:02:22 compute-0 sudo[52404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:22 compute-0 python3.9[52406]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 10 10:02:24 compute-0 sudo[52404]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:24 compute-0 sudo[52566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikvvzykercsvzkkgfrarokujarveynzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360944.4934318-76-14113843549820/AnsiballZ_dnf.py'
Dec 10 10:02:24 compute-0 sudo[52566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:24 compute-0 python3.9[52568]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:02:36 compute-0 kernel: SELinux:  Converting 2732 SID table entries...
Dec 10 10:02:36 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 10 10:02:36 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 10 10:02:36 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 10 10:02:36 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 10 10:02:36 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 10 10:02:36 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 10 10:02:36 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 10 10:02:36 compute-0 groupadd[52591]: group added to /etc/group: name=unbound, GID=993
Dec 10 10:02:36 compute-0 groupadd[52591]: group added to /etc/gshadow: name=unbound
Dec 10 10:02:36 compute-0 groupadd[52591]: new group: name=unbound, GID=993
Dec 10 10:02:36 compute-0 useradd[52598]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Dec 10 10:02:36 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 10 10:02:36 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 10 10:02:37 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 10 10:02:37 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 10 10:02:37 compute-0 systemd[1]: Reloading.
Dec 10 10:02:37 compute-0 systemd-rc-local-generator[53090]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:02:37 compute-0 systemd-sysv-generator[53094]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:02:37 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 10 10:02:38 compute-0 sudo[52566]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:38 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 10 10:02:38 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 10 10:02:38 compute-0 systemd[1]: run-r742acdb826a04857bb417927615bbe68.service: Deactivated successfully.
Dec 10 10:02:39 compute-0 sudo[53665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gulgtntxkrbpkwrfcfnuvmeosclqpgji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360958.5978894-84-20142645544133/AnsiballZ_systemd.py'
Dec 10 10:02:39 compute-0 sudo[53665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:39 compute-0 python3.9[53667]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 10 10:02:39 compute-0 systemd[1]: Reloading.
Dec 10 10:02:39 compute-0 systemd-rc-local-generator[53697]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:02:39 compute-0 systemd-sysv-generator[53701]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:02:39 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Dec 10 10:02:39 compute-0 chown[53709]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 10 10:02:39 compute-0 ovs-ctl[53714]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 10 10:02:40 compute-0 ovs-ctl[53714]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 10 10:02:40 compute-0 ovs-ctl[53714]: Starting ovsdb-server [  OK  ]
Dec 10 10:02:40 compute-0 ovs-vsctl[53764]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 10 10:02:40 compute-0 ovs-vsctl[53783]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"65d7f098-ee7c-47ff-b5dd-8c0c64a94f34\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 10 10:02:40 compute-0 ovs-ctl[53714]: Configuring Open vSwitch system IDs [  OK  ]
Dec 10 10:02:40 compute-0 ovs-ctl[53714]: Enabling remote OVSDB managers [  OK  ]
Dec 10 10:02:40 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Dec 10 10:02:40 compute-0 ovs-vsctl[53789]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 10 10:02:40 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 10 10:02:40 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 10 10:02:40 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 10 10:02:40 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Dec 10 10:02:40 compute-0 ovs-ctl[53833]: Inserting openvswitch module [  OK  ]
Dec 10 10:02:40 compute-0 ovs-ctl[53802]: Starting ovs-vswitchd [  OK  ]
Dec 10 10:02:40 compute-0 ovs-vsctl[53854]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 10 10:02:40 compute-0 ovs-ctl[53802]: Enabling remote OVSDB managers [  OK  ]
Dec 10 10:02:40 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 10 10:02:40 compute-0 systemd[1]: Starting Open vSwitch...
Dec 10 10:02:40 compute-0 systemd[1]: Finished Open vSwitch.
Dec 10 10:02:40 compute-0 sudo[53665]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:41 compute-0 python3.9[54005]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:02:42 compute-0 sudo[54155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpppkomsrsdcdoehqlplakdrsxknpfdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360961.7125075-102-101002064861090/AnsiballZ_sefcontext.py'
Dec 10 10:02:42 compute-0 sudo[54155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:42 compute-0 python3.9[54157]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 10 10:02:43 compute-0 kernel: SELinux:  Converting 2746 SID table entries...
Dec 10 10:02:43 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 10 10:02:43 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 10 10:02:43 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 10 10:02:43 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 10 10:02:43 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 10 10:02:43 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 10 10:02:43 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 10 10:02:43 compute-0 sudo[54155]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:44 compute-0 python3.9[54312]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:02:45 compute-0 sudo[54468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzydzfhaklibjgojnpkojxvfghfxjnbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360965.188133-120-90761290652715/AnsiballZ_dnf.py'
Dec 10 10:02:45 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 10 10:02:45 compute-0 sudo[54468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:45 compute-0 python3.9[54470]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:02:46 compute-0 sudo[54468]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:47 compute-0 sudo[54621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nueybtmpypfvxkoggdtoolgrjxsfidpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360967.1598806-128-16089042768845/AnsiballZ_command.py'
Dec 10 10:02:47 compute-0 sudo[54621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:47 compute-0 python3.9[54623]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:02:48 compute-0 sudo[54621]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:49 compute-0 sudo[54908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hifrqtgdrcoytexlrvdktykrrksnhsec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360968.6482728-136-7354555006754/AnsiballZ_file.py'
Dec 10 10:02:49 compute-0 sudo[54908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:49 compute-0 python3.9[54910]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 10 10:02:49 compute-0 sudo[54908]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:50 compute-0 python3.9[55060]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:02:50 compute-0 sudo[55212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pisqeeqihagytibmupejvoubrdjqxvnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360970.3618152-152-98414470050594/AnsiballZ_dnf.py'
Dec 10 10:02:50 compute-0 sudo[55212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:50 compute-0 python3.9[55214]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:02:52 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 10 10:02:52 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 10 10:02:52 compute-0 systemd[1]: Reloading.
Dec 10 10:02:52 compute-0 systemd-rc-local-generator[55253]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:02:52 compute-0 systemd-sysv-generator[55256]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:02:52 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 10 10:02:52 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 10 10:02:52 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 10 10:02:52 compute-0 systemd[1]: run-r65730e0f313a4c12a6d1b3614ce600d5.service: Deactivated successfully.
Dec 10 10:02:53 compute-0 sudo[55212]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:53 compute-0 sudo[55528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tutezftykadfttfhrppwfjmrumpshzcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360973.3134592-160-11277640100964/AnsiballZ_systemd.py'
Dec 10 10:02:53 compute-0 sudo[55528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:53 compute-0 python3.9[55530]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:02:54 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 10 10:02:54 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Dec 10 10:02:54 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Dec 10 10:02:54 compute-0 systemd[1]: Stopping Network Manager...
Dec 10 10:02:54 compute-0 NetworkManager[7187]: <info>  [1765360974.0281] caught SIGTERM, shutting down normally.
Dec 10 10:02:54 compute-0 NetworkManager[7187]: <info>  [1765360974.0299] dhcp4 (eth0): canceled DHCP transaction
Dec 10 10:02:54 compute-0 NetworkManager[7187]: <info>  [1765360974.0299] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 10 10:02:54 compute-0 NetworkManager[7187]: <info>  [1765360974.0299] dhcp4 (eth0): state changed no lease
Dec 10 10:02:54 compute-0 NetworkManager[7187]: <info>  [1765360974.0302] manager: NetworkManager state is now CONNECTED_SITE
Dec 10 10:02:54 compute-0 NetworkManager[7187]: <info>  [1765360974.0363] exiting (success)
Dec 10 10:02:54 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 10 10:02:54 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 10 10:02:54 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 10 10:02:54 compute-0 systemd[1]: Stopped Network Manager.
Dec 10 10:02:54 compute-0 systemd[1]: NetworkManager.service: Consumed 11.903s CPU time, 4.1M memory peak, read 0B from disk, written 25.5K to disk.
Dec 10 10:02:54 compute-0 systemd[1]: Starting Network Manager...
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.1067] NetworkManager (version 1.54.2-1.el9) is starting... (after a restart, boot:1f343dd7-be59-44c1-890a-3a416daf01a6)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.1070] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.1151] manager[0x5633acd93000]: monitoring kernel firmware directory '/lib/firmware'.
Dec 10 10:02:54 compute-0 systemd[1]: Starting Hostname Service...
Dec 10 10:02:54 compute-0 systemd[1]: Started Hostname Service.
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2008] hostname: hostname: using hostnamed
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2009] hostname: static hostname changed from (none) to "compute-0"
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2014] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2018] manager[0x5633acd93000]: rfkill: Wi-Fi hardware radio set enabled
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2019] manager[0x5633acd93000]: rfkill: WWAN hardware radio set enabled
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2041] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-ovs.so)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2050] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2051] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2051] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2052] manager: Networking is enabled by state file
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2054] settings: Loaded settings plugin: keyfile (internal)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2058] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2091] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2102] dhcp: init: Using DHCP client 'internal'
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2106] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2112] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2125] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2132] device (lo): Activation: starting connection 'lo' (0756ffbc-1ad3-4f52-9877-3151352e5ed6)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2138] device (eth0): carrier: link connected
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2142] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2146] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2146] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2151] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2156] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2161] device (eth1): carrier: link connected
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2164] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2168] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (9612a26e-51e9-58b1-bfd8-b472df5bf061) (indicated)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2169] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2172] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2179] device (eth1): Activation: starting connection 'ci-private-network' (9612a26e-51e9-58b1-bfd8-b472df5bf061)
Dec 10 10:02:54 compute-0 systemd[1]: Started Network Manager.
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2187] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2209] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2212] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2214] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2217] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2221] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2224] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2227] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2231] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2241] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2245] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2258] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2272] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2284] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2286] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2293] device (lo): Activation: successful, device activated.
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2301] dhcp4 (eth0): state changed new lease, address=38.102.83.219
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2309] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 10 10:02:54 compute-0 systemd[1]: Starting Network Manager Wait Online...
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2385] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2394] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2396] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2400] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2402] device (eth1): Activation: successful, device activated.
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2421] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2422] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2426] manager: NetworkManager state is now CONNECTED_SITE
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2428] device (eth0): Activation: successful, device activated.
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2433] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 10 10:02:54 compute-0 NetworkManager[55541]: <info>  [1765360974.2435] manager: startup complete
Dec 10 10:02:54 compute-0 systemd[1]: Finished Network Manager Wait Online.
Dec 10 10:02:54 compute-0 sudo[55528]: pam_unix(sudo:session): session closed for user root
Dec 10 10:02:54 compute-0 sudo[55754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpabdvotpkmkpjfdwgnipmjfkczrywea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360974.4147995-168-194268790495212/AnsiballZ_dnf.py'
Dec 10 10:02:54 compute-0 sudo[55754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:02:54 compute-0 python3.9[55756]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:02:59 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 10 10:02:59 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 10 10:02:59 compute-0 systemd[1]: Reloading.
Dec 10 10:02:59 compute-0 systemd-sysv-generator[55809]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:02:59 compute-0 systemd-rc-local-generator[55806]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:02:59 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 10 10:03:00 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 10 10:03:00 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 10 10:03:00 compute-0 systemd[1]: run-rc0c59caead144bc3b9f5b125fceb604e.service: Deactivated successfully.
Dec 10 10:03:00 compute-0 sudo[55754]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:00 compute-0 sudo[56212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksasjclrvsquqeoxrnhspqsbcagywrud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360980.6472096-180-36741438519565/AnsiballZ_stat.py'
Dec 10 10:03:00 compute-0 sudo[56212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:01 compute-0 python3.9[56214]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:03:01 compute-0 sudo[56212]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:01 compute-0 sudo[56364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgvlnyvdiuuawlrbatnhalngwjibnkud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360981.5261135-189-128831633345557/AnsiballZ_ini_file.py'
Dec 10 10:03:01 compute-0 sudo[56364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:02 compute-0 python3.9[56366]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:03:02 compute-0 sudo[56364]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:02 compute-0 sudo[56518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joflednuxisdzzfxcuwvuptzauihwdgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360982.402831-199-136823213047968/AnsiballZ_ini_file.py'
Dec 10 10:03:02 compute-0 sudo[56518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:02 compute-0 python3.9[56520]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:03:02 compute-0 sudo[56518]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:03 compute-0 sudo[56670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jogtzorvwnkgqzdehlseqjmjfvqltgme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360982.9929879-199-74541523438627/AnsiballZ_ini_file.py'
Dec 10 10:03:03 compute-0 sudo[56670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:03 compute-0 python3.9[56672]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:03:03 compute-0 sudo[56670]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:03 compute-0 sudo[56822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flzjytdblaufuplftfswhrwdsqntzfvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360983.6067069-214-111682296326287/AnsiballZ_ini_file.py'
Dec 10 10:03:03 compute-0 sudo[56822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:04 compute-0 python3.9[56824]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:03:04 compute-0 sudo[56822]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:05 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 10 10:03:05 compute-0 sudo[56974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zorrvxcwpoxribeaxakyjfoyzozbaysl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360985.1517732-214-14411174233848/AnsiballZ_ini_file.py'
Dec 10 10:03:05 compute-0 sudo[56974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:05 compute-0 python3.9[56976]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:03:05 compute-0 sudo[56974]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:06 compute-0 sudo[57126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzequqotsjtomuioanqiplvvvlexvggn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360985.842138-229-280457299043348/AnsiballZ_stat.py'
Dec 10 10:03:06 compute-0 sudo[57126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:06 compute-0 python3.9[57128]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:03:06 compute-0 sudo[57126]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:06 compute-0 sudo[57249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdeqiusmqplsedjfoctvgfrkvixdbnnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360985.842138-229-280457299043348/AnsiballZ_copy.py'
Dec 10 10:03:06 compute-0 sudo[57249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:07 compute-0 python3.9[57251]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765360985.842138-229-280457299043348/.source _original_basename=.vsu2w1qn follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:03:07 compute-0 sudo[57249]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:07 compute-0 sudo[57401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwdqklsjqzwgqbdsfnrcnvivruretgsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360987.255094-244-87295381227281/AnsiballZ_file.py'
Dec 10 10:03:07 compute-0 sudo[57401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:07 compute-0 python3.9[57403]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:03:07 compute-0 sudo[57401]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:08 compute-0 sudo[57553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aejfmfhewdmaccspxuujcljluckkkjhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360987.9461665-252-172045399696888/AnsiballZ_edpm_os_net_config_mappings.py'
Dec 10 10:03:08 compute-0 sudo[57553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:08 compute-0 python3.9[57555]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 10 10:03:08 compute-0 sudo[57553]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:09 compute-0 sudo[57705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxzdhgvdkholktjbbdqktzocqrgrlbkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360988.827655-261-15063363167222/AnsiballZ_file.py'
Dec 10 10:03:09 compute-0 sudo[57705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:09 compute-0 python3.9[57707]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:03:09 compute-0 sudo[57705]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:10 compute-0 sudo[57857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxsnuddgoqrsblkdkzpmuenghojoyfdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360989.7407887-271-246177995505044/AnsiballZ_stat.py'
Dec 10 10:03:10 compute-0 sudo[57857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:10 compute-0 sudo[57857]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:10 compute-0 sudo[57980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iddknifcxaxgynuxaecxlmogexkytpip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360989.7407887-271-246177995505044/AnsiballZ_copy.py'
Dec 10 10:03:10 compute-0 sudo[57980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:10 compute-0 sudo[57980]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:11 compute-0 sudo[58132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaulrzejxhlisblvoprhbscrysdpkszl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360991.02274-286-185095467175970/AnsiballZ_slurp.py'
Dec 10 10:03:11 compute-0 sudo[58132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:11 compute-0 python3.9[58134]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 10 10:03:11 compute-0 sudo[58132]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:12 compute-0 sudo[58307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezpzqnvpfucwwkgfwsiuhzzmltqswvzd ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360992.0232632-295-244791785181377/async_wrapper.py j179741478510 300 /home/zuul/.ansible/tmp/ansible-tmp-1765360992.0232632-295-244791785181377/AnsiballZ_edpm_os_net_config.py _'
Dec 10 10:03:12 compute-0 sudo[58307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:12 compute-0 ansible-async_wrapper.py[58309]: Invoked with j179741478510 300 /home/zuul/.ansible/tmp/ansible-tmp-1765360992.0232632-295-244791785181377/AnsiballZ_edpm_os_net_config.py _
Dec 10 10:03:12 compute-0 ansible-async_wrapper.py[58312]: Starting module and watcher
Dec 10 10:03:12 compute-0 ansible-async_wrapper.py[58312]: Start watching 58313 (300)
Dec 10 10:03:12 compute-0 ansible-async_wrapper.py[58313]: Start module (58313)
Dec 10 10:03:12 compute-0 ansible-async_wrapper.py[58309]: Return async_wrapper task started.
Dec 10 10:03:12 compute-0 sudo[58307]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:13 compute-0 python3.9[58314]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 10 10:03:13 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 10 10:03:13 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 10 10:03:13 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 10 10:03:13 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 10 10:03:13 compute-0 kernel: cfg80211: failed to load regulatory.db
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.1657] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.1681] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2289] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2291] audit: op="connection-add" uuid="af5dde23-5fc0-4c57-9c94-9f699af5aa6f" name="br-ex-br" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2316] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2317] audit: op="connection-add" uuid="d42d62f3-a508-46d9-ac7a-8d43565771dd" name="br-ex-port" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2337] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2338] audit: op="connection-add" uuid="0a7926de-836b-447a-9ea1-af8d80298de6" name="eth1-port" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2352] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2354] audit: op="connection-add" uuid="aa4ae8d9-1808-48e9-b2d0-737459a6254b" name="vlan20-port" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2364] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2365] audit: op="connection-add" uuid="2d3cea2d-ca42-4095-abbf-4be02b3e990e" name="vlan21-port" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2376] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2377] audit: op="connection-add" uuid="0756958d-91a5-46da-bfc6-b70f93963c56" name="vlan22-port" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2396] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2410] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2411] audit: op="connection-add" uuid="dea8e0c1-a4a7-4547-9ff7-ed562bfa6140" name="br-ex-if" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2457] audit: op="connection-update" uuid="9612a26e-51e9-58b1-bfd8-b472df5bf061" name="ci-private-network" args="connection.slave-type,connection.controller,connection.master,connection.port-type,connection.timestamp,ipv6.addresses,ipv6.dns,ipv6.routes,ipv6.addr-gen-mode,ipv6.method,ipv6.routing-rules,ovs-external-ids.data,ovs-interface.type,ipv4.addresses,ipv4.dns,ipv4.never-default,ipv4.routing-rules,ipv4.method,ipv4.routes" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2473] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2474] audit: op="connection-add" uuid="38741002-a18d-44df-89f5-58d58a1e6c32" name="vlan20-if" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2488] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2490] audit: op="connection-add" uuid="524ebbb5-7d1c-4e43-b827-05e1a7404b6b" name="vlan21-if" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2506] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2508] audit: op="connection-add" uuid="7b379d8f-bdcc-416c-bfa2-b3fc8c267ca7" name="vlan22-if" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2517] audit: op="connection-delete" uuid="38855df4-24db-33d9-b8f2-98603420bda3" name="Wired connection 1" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2528] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <warn>  [1765360995.2530] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2536] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2539] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (af5dde23-5fc0-4c57-9c94-9f699af5aa6f)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2539] audit: op="connection-activate" uuid="af5dde23-5fc0-4c57-9c94-9f699af5aa6f" name="br-ex-br" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2540] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <warn>  [1765360995.2541] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Success
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2545] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2548] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (d42d62f3-a508-46d9-ac7a-8d43565771dd)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2549] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <warn>  [1765360995.2549] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2552] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2555] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (0a7926de-836b-447a-9ea1-af8d80298de6)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2556] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <warn>  [1765360995.2557] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2562] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2565] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (aa4ae8d9-1808-48e9-b2d0-737459a6254b)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2566] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <warn>  [1765360995.2567] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2570] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2573] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (2d3cea2d-ca42-4095-abbf-4be02b3e990e)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2574] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <warn>  [1765360995.2575] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2578] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2581] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (0756958d-91a5-46da-bfc6-b70f93963c56)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2582] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2583] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2585] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2589] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <warn>  [1765360995.2590] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2592] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2594] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (dea8e0c1-a4a7-4547-9ff7-ed562bfa6140)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2595] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2598] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2599] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2600] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2601] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2609] device (eth1): disconnecting for new activation request.
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2609] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2624] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2628] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2630] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2635] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <warn>  [1765360995.2636] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2642] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2649] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (38741002-a18d-44df-89f5-58d58a1e6c32)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2651] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2658] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2662] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2665] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2671] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <warn>  [1765360995.2673] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2681] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2690] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (524ebbb5-7d1c-4e43-b827-05e1a7404b6b)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2691] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2698] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2701] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2704] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2709] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <warn>  [1765360995.2710] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2715] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2723] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (7b379d8f-bdcc-416c-bfa2-b3fc8c267ca7)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2724] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2729] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2731] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2733] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2736] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2757] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2760] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2766] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2769] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2780] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2786] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2792] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 kernel: ovs-system: entered promiscuous mode
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2813] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2819] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 systemd-udevd[58320]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:03:15 compute-0 kernel: Timeout policy base is empty
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2838] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2843] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2846] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2848] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2852] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2856] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2860] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2862] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2866] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2870] dhcp4 (eth0): canceled DHCP transaction
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2871] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2871] dhcp4 (eth0): state changed no lease
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2872] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2883] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2888] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58315 uid=0 result="fail" reason="Device is not activated"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2891] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2935] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2939] dhcp4 (eth0): state changed new lease, address=38.102.83.219
Dec 10 10:03:15 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2951] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.2987] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 10 10:03:15 compute-0 kernel: br-ex: entered promiscuous mode
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3124] device (eth1): Activation: starting connection 'ci-private-network' (9612a26e-51e9-58b1-bfd8-b472df5bf061)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3130] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3131] device (eth1): state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3137] device (eth1): disconnecting for new activation request.
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3137] audit: op="connection-activate" uuid="9612a26e-51e9-58b1-bfd8-b472df5bf061" name="ci-private-network" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3143] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3145] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3152] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3153] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3162] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3163] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3164] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3166] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3169] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3172] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3174] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3177] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3180] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3183] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3185] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3188] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3195] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 10 10:03:15 compute-0 systemd-udevd[58321]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:03:15 compute-0 kernel: vlan22: entered promiscuous mode
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3224] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3229] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58315 uid=0 result="success"
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3229] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3235] device (eth1): Activation: starting connection 'ci-private-network' (9612a26e-51e9-58b1-bfd8-b472df5bf061)
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3241] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3244] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 kernel: vlan20: entered promiscuous mode
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3260] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3276] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3279] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3288] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3299] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3301] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3302] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3306] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3311] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3315] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3319] device (eth1): Activation: successful, device activated.
Dec 10 10:03:15 compute-0 kernel: vlan21: entered promiscuous mode
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3335] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3390] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3391] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3394] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3398] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3411] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3447] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3447] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3450] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3454] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3482] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3522] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3524] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 10 10:03:15 compute-0 NetworkManager[55541]: <info>  [1765360995.3529] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 10 10:03:16 compute-0 NetworkManager[55541]: <info>  [1765360996.4717] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58315 uid=0 result="success"
Dec 10 10:03:16 compute-0 NetworkManager[55541]: <info>  [1765360996.6491] checkpoint[0x5633acd68950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 10 10:03:16 compute-0 NetworkManager[55541]: <info>  [1765360996.6495] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58315 uid=0 result="success"
Dec 10 10:03:16 compute-0 sudo[58651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epflyowxtbxqigtdcdeixhokpzbwngkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360996.183616-295-170789599618351/AnsiballZ_async_status.py'
Dec 10 10:03:16 compute-0 sudo[58651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:16 compute-0 NetworkManager[55541]: <info>  [1765360996.9257] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58315 uid=0 result="success"
Dec 10 10:03:16 compute-0 NetworkManager[55541]: <info>  [1765360996.9272] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58315 uid=0 result="success"
Dec 10 10:03:16 compute-0 python3.9[58653]: ansible-ansible.legacy.async_status Invoked with jid=j179741478510.58309 mode=status _async_dir=/root/.ansible_async
Dec 10 10:03:16 compute-0 sudo[58651]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:17 compute-0 NetworkManager[55541]: <info>  [1765360997.1116] audit: op="networking-control" arg="global-dns-configuration" pid=58315 uid=0 result="success"
Dec 10 10:03:17 compute-0 NetworkManager[55541]: <info>  [1765360997.1145] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec 10 10:03:17 compute-0 NetworkManager[55541]: <info>  [1765360997.1176] audit: op="networking-control" arg="global-dns-configuration" pid=58315 uid=0 result="success"
Dec 10 10:03:17 compute-0 NetworkManager[55541]: <info>  [1765360997.1199] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58315 uid=0 result="success"
Dec 10 10:03:17 compute-0 NetworkManager[55541]: <info>  [1765360997.2684] checkpoint[0x5633acd68a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 10 10:03:17 compute-0 NetworkManager[55541]: <info>  [1765360997.2690] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58315 uid=0 result="success"
Dec 10 10:03:17 compute-0 ansible-async_wrapper.py[58313]: Module complete (58313)
Dec 10 10:03:17 compute-0 ansible-async_wrapper.py[58312]: Done in kid B.
Dec 10 10:03:20 compute-0 sudo[58757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkzfhjfwvmxsegxqnpgrhxhqbxuzstdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360996.183616-295-170789599618351/AnsiballZ_async_status.py'
Dec 10 10:03:20 compute-0 sudo[58757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:20 compute-0 python3.9[58759]: ansible-ansible.legacy.async_status Invoked with jid=j179741478510.58309 mode=status _async_dir=/root/.ansible_async
Dec 10 10:03:20 compute-0 sudo[58757]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:20 compute-0 sudo[58856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vylozrfzwlubuoyxdpegjdlijwhkuyxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765360996.183616-295-170789599618351/AnsiballZ_async_status.py'
Dec 10 10:03:20 compute-0 sudo[58856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:21 compute-0 python3.9[58858]: ansible-ansible.legacy.async_status Invoked with jid=j179741478510.58309 mode=cleanup _async_dir=/root/.ansible_async
Dec 10 10:03:21 compute-0 sudo[58856]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:21 compute-0 sudo[59008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymwfajaakuovubtqznbxjtvkqoyxvjnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361001.2349756-322-1196465839937/AnsiballZ_stat.py'
Dec 10 10:03:21 compute-0 sudo[59008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:21 compute-0 python3.9[59010]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:03:21 compute-0 sudo[59008]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:22 compute-0 sudo[59131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqrfjmvnvssqegnkzghlbrglomacecsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361001.2349756-322-1196465839937/AnsiballZ_copy.py'
Dec 10 10:03:22 compute-0 sudo[59131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:22 compute-0 python3.9[59133]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361001.2349756-322-1196465839937/.source.returncode _original_basename=.6nzr37jr follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:03:22 compute-0 sudo[59131]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:22 compute-0 sudo[59283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxzfnvpxhakpebxdpvlcwgjpokahsfbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361002.477092-338-260134198439373/AnsiballZ_stat.py'
Dec 10 10:03:22 compute-0 sudo[59283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:22 compute-0 python3.9[59285]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:03:22 compute-0 sudo[59283]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:23 compute-0 sudo[59406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifpjcvydruwnniwzebzjrtabfgsxvpjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361002.477092-338-260134198439373/AnsiballZ_copy.py'
Dec 10 10:03:23 compute-0 sudo[59406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:23 compute-0 python3.9[59408]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361002.477092-338-260134198439373/.source.cfg _original_basename=.m1jillll follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:03:23 compute-0 sudo[59406]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:23 compute-0 sudo[59559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdfdufqfcommptpkvyneoczbgvzwrhxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361003.6603818-353-232502192527141/AnsiballZ_systemd.py'
Dec 10 10:03:23 compute-0 sudo[59559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:24 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 10 10:03:24 compute-0 python3.9[59561]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:03:24 compute-0 systemd[1]: Reloading Network Manager...
Dec 10 10:03:24 compute-0 NetworkManager[55541]: <info>  [1765361004.3598] audit: op="reload" arg="0" pid=59568 uid=0 result="success"
Dec 10 10:03:24 compute-0 NetworkManager[55541]: <info>  [1765361004.3607] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 10 10:03:24 compute-0 systemd[1]: Reloaded Network Manager.
Dec 10 10:03:24 compute-0 sudo[59559]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:24 compute-0 sshd-session[51545]: Connection closed by 192.168.122.30 port 48682
Dec 10 10:03:24 compute-0 sshd-session[51542]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:03:24 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Dec 10 10:03:24 compute-0 systemd[1]: session-12.scope: Consumed 50.423s CPU time.
Dec 10 10:03:24 compute-0 systemd-logind[787]: Session 12 logged out. Waiting for processes to exit.
Dec 10 10:03:24 compute-0 systemd-logind[787]: Removed session 12.
Dec 10 10:03:30 compute-0 sshd-session[59599]: Accepted publickey for zuul from 192.168.122.30 port 56362 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:03:30 compute-0 systemd-logind[787]: New session 13 of user zuul.
Dec 10 10:03:30 compute-0 systemd[1]: Started Session 13 of User zuul.
Dec 10 10:03:30 compute-0 sshd-session[59599]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:03:31 compute-0 python3.9[59752]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:03:32 compute-0 python3.9[59906]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:03:33 compute-0 python3.9[60096]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:03:34 compute-0 sshd-session[59602]: Connection closed by 192.168.122.30 port 56362
Dec 10 10:03:34 compute-0 sshd-session[59599]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:03:34 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Dec 10 10:03:34 compute-0 systemd[1]: session-13.scope: Consumed 2.462s CPU time.
Dec 10 10:03:34 compute-0 systemd-logind[787]: Session 13 logged out. Waiting for processes to exit.
Dec 10 10:03:34 compute-0 systemd-logind[787]: Removed session 13.
Dec 10 10:03:34 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 10 10:03:39 compute-0 sshd-session[60124]: Accepted publickey for zuul from 192.168.122.30 port 57018 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:03:39 compute-0 systemd-logind[787]: New session 14 of user zuul.
Dec 10 10:03:39 compute-0 systemd[1]: Started Session 14 of User zuul.
Dec 10 10:03:39 compute-0 sshd-session[60124]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:03:40 compute-0 python3.9[60278]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:03:41 compute-0 python3.9[60432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:03:42 compute-0 sudo[60586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fshhwjtwbvlcjkoqwdhcmmiowknhjbwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361021.8906255-40-93732237503375/AnsiballZ_setup.py'
Dec 10 10:03:42 compute-0 sudo[60586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:42 compute-0 python3.9[60588]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:03:42 compute-0 sudo[60586]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:43 compute-0 sudo[60671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdxpsyxospzfnrxfhlwlvlvlysmhrkhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361021.8906255-40-93732237503375/AnsiballZ_dnf.py'
Dec 10 10:03:43 compute-0 sudo[60671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:43 compute-0 python3.9[60673]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:03:44 compute-0 sudo[60671]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:45 compute-0 sudo[60824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bubtacjvvbsqibcmroedjpzkhsqykorj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361024.875153-52-196982430028249/AnsiballZ_setup.py'
Dec 10 10:03:45 compute-0 sudo[60824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:45 compute-0 python3.9[60826]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:03:45 compute-0 sudo[60824]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:46 compute-0 sudo[61016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngjygnwvsqeogmvpnvkkpumkyqrqzagv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361025.977863-63-250705891704477/AnsiballZ_file.py'
Dec 10 10:03:46 compute-0 sudo[61016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:46 compute-0 python3.9[61018]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:03:46 compute-0 sudo[61016]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:47 compute-0 sudo[61168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgfizfrazobxtjcptzwxtckjqfqphjji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361026.8288486-71-21492225918999/AnsiballZ_command.py'
Dec 10 10:03:47 compute-0 sudo[61168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:47 compute-0 python3.9[61170]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:03:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:03:47 compute-0 sudo[61168]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:48 compute-0 sudo[61331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxfuibzjpbkkxymklfwoaumgbyegrrus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361027.7905405-79-150261249699575/AnsiballZ_stat.py'
Dec 10 10:03:48 compute-0 sudo[61331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:48 compute-0 python3.9[61333]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:03:48 compute-0 sudo[61331]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:48 compute-0 sudo[61409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omaeaqbricrpgxcianbbswrjdqcphuzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361027.7905405-79-150261249699575/AnsiballZ_file.py'
Dec 10 10:03:48 compute-0 sudo[61409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:48 compute-0 python3.9[61411]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:03:48 compute-0 sudo[61409]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:49 compute-0 sudo[61561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylkjdgthxaohqhytlrstpspdydymlull ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361029.0989537-91-54477336578074/AnsiballZ_stat.py'
Dec 10 10:03:49 compute-0 sudo[61561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:49 compute-0 python3.9[61563]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:03:49 compute-0 sudo[61561]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:49 compute-0 sudo[61639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojibudaztekbafbcxhsfsnuongnlzvdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361029.0989537-91-54477336578074/AnsiballZ_file.py'
Dec 10 10:03:49 compute-0 sudo[61639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:50 compute-0 python3.9[61641]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:03:50 compute-0 sudo[61639]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:50 compute-0 sudo[61791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmydvodbrkqcfnoskcgrvetzqjwpftes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361030.2161915-104-144302009284597/AnsiballZ_ini_file.py'
Dec 10 10:03:50 compute-0 sudo[61791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:50 compute-0 python3.9[61793]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:03:50 compute-0 sudo[61791]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:51 compute-0 sudo[61943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkygbovgwewlgsnzmzaedeajvvmuwofq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361030.9987998-104-126752048333065/AnsiballZ_ini_file.py'
Dec 10 10:03:51 compute-0 sudo[61943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:51 compute-0 python3.9[61945]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:03:51 compute-0 sudo[61943]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:51 compute-0 sudo[62095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmoktjrdglnmgytqyertvbmodmbglvpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361031.7243466-104-14822322374458/AnsiballZ_ini_file.py'
Dec 10 10:03:51 compute-0 sudo[62095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:52 compute-0 python3.9[62097]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:03:52 compute-0 sudo[62095]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:52 compute-0 sudo[62247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqdoxikulfhiaobtmhpjotnvbdtasofx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361032.3042555-104-245434407996956/AnsiballZ_ini_file.py'
Dec 10 10:03:52 compute-0 sudo[62247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:52 compute-0 python3.9[62249]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:03:52 compute-0 sudo[62247]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:53 compute-0 sudo[62399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gczgvzopcgpedisoytostlbvqsmkgwth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361033.0211558-135-51736648358785/AnsiballZ_dnf.py'
Dec 10 10:03:53 compute-0 sudo[62399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:53 compute-0 python3.9[62401]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:03:54 compute-0 sudo[62399]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:55 compute-0 sudo[62552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlmomuvrgevwxwbwqvpiugjbhvxpsfhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361035.0998251-146-274494708125786/AnsiballZ_setup.py'
Dec 10 10:03:55 compute-0 sudo[62552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:55 compute-0 python3.9[62554]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:03:55 compute-0 sudo[62552]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:56 compute-0 sudo[62706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxkytfvaahgtncetrixenaqiisinmdov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361035.9214044-154-108564671452227/AnsiballZ_stat.py'
Dec 10 10:03:56 compute-0 sudo[62706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:56 compute-0 python3.9[62708]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:03:56 compute-0 sudo[62706]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:56 compute-0 sudo[62858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trieszwflaqblbugugabdhdsyjdvjcjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361036.5636265-163-94371595190835/AnsiballZ_stat.py'
Dec 10 10:03:56 compute-0 sudo[62858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:56 compute-0 python3.9[62860]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:03:57 compute-0 sudo[62858]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:57 compute-0 sudo[63010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giquozpmpglsffiggrcusmmkekmvgnai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361037.2825446-173-186631570423605/AnsiballZ_command.py'
Dec 10 10:03:57 compute-0 sudo[63010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:57 compute-0 python3.9[63012]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:03:57 compute-0 sudo[63010]: pam_unix(sudo:session): session closed for user root
Dec 10 10:03:58 compute-0 sudo[63163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcwopsbabxiipqgubjscqojmvtzvoxqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361038.0266416-183-264126987286070/AnsiballZ_service_facts.py'
Dec 10 10:03:58 compute-0 sudo[63163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:03:58 compute-0 python3.9[63165]: ansible-service_facts Invoked
Dec 10 10:03:58 compute-0 network[63182]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 10 10:03:58 compute-0 network[63183]: 'network-scripts' will be removed from distribution in near future.
Dec 10 10:03:58 compute-0 network[63184]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 10 10:04:01 compute-0 sudo[63163]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:02 compute-0 sudo[63467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibdltigwjqvdmyayaklzitsnsbdulkal ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1765361042.4347992-198-238014353816709/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1765361042.4347992-198-238014353816709/args'
Dec 10 10:04:02 compute-0 sudo[63467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:02 compute-0 sudo[63467]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:03 compute-0 sudo[63634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgwvfcxvhgihdycicptksuafpgkbwoey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361043.060474-209-96804834306887/AnsiballZ_dnf.py'
Dec 10 10:04:03 compute-0 sudo[63634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:03 compute-0 python3.9[63636]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:04:04 compute-0 sudo[63634]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:05 compute-0 sudo[63787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iooaklrxbmgprhxpsaxbyteflnozggvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361045.3342664-222-133958866783407/AnsiballZ_package_facts.py'
Dec 10 10:04:05 compute-0 sudo[63787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:06 compute-0 python3.9[63789]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 10 10:04:06 compute-0 sudo[63787]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:07 compute-0 sudo[63939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pirdvkcuasxofokdlalkmexnpewmpziy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361046.8681078-232-152054154738526/AnsiballZ_stat.py'
Dec 10 10:04:07 compute-0 sudo[63939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:07 compute-0 python3.9[63941]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:07 compute-0 sudo[63939]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:07 compute-0 sudo[64064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkigklpgxjevwepmprqsccmafjcjqknr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361046.8681078-232-152054154738526/AnsiballZ_copy.py'
Dec 10 10:04:07 compute-0 sudo[64064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:08 compute-0 python3.9[64066]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361046.8681078-232-152054154738526/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:08 compute-0 sudo[64064]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:08 compute-0 sudo[64218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kianwrtafxciyskjohixxvhzdjfalywv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361048.3700569-247-218315622323172/AnsiballZ_stat.py'
Dec 10 10:04:08 compute-0 sudo[64218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:08 compute-0 python3.9[64220]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:08 compute-0 sudo[64218]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:09 compute-0 sudo[64343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jasjfnciinsolcvxzqkvnbxifnxgebrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361048.3700569-247-218315622323172/AnsiballZ_copy.py'
Dec 10 10:04:09 compute-0 sudo[64343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:09 compute-0 python3.9[64345]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361048.3700569-247-218315622323172/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:09 compute-0 sudo[64343]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:10 compute-0 sudo[64497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcyzavfmhzptxyathvbdfyphuaaqfjsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361050.0946116-268-249135017173212/AnsiballZ_lineinfile.py'
Dec 10 10:04:10 compute-0 sudo[64497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:10 compute-0 python3.9[64499]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:10 compute-0 sudo[64497]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:11 compute-0 sudo[64651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcnkuudkisiccsgzpzzjaizmeywslhfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361051.4041839-283-38543928656230/AnsiballZ_setup.py'
Dec 10 10:04:11 compute-0 sudo[64651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:12 compute-0 python3.9[64653]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:04:12 compute-0 sudo[64651]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:12 compute-0 sudo[64735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcunizwoqvcmpypfqjjingpyclfifdgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361051.4041839-283-38543928656230/AnsiballZ_systemd.py'
Dec 10 10:04:12 compute-0 sudo[64735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:13 compute-0 python3.9[64737]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:04:13 compute-0 sudo[64735]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:14 compute-0 sudo[64889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsraokexuvkcdrbtcpsfmcbngxqaxfaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361053.7054405-299-20459117046982/AnsiballZ_setup.py'
Dec 10 10:04:14 compute-0 sudo[64889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:14 compute-0 python3.9[64891]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:04:14 compute-0 sudo[64889]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:15 compute-0 sudo[64973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joxjmtqomxkpfxwgqlsjbmopsbpyobhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361053.7054405-299-20459117046982/AnsiballZ_systemd.py'
Dec 10 10:04:15 compute-0 sudo[64973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:15 compute-0 python3.9[64975]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:04:15 compute-0 chronyd[800]: chronyd exiting
Dec 10 10:04:15 compute-0 systemd[1]: Stopping NTP client/server...
Dec 10 10:04:15 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Dec 10 10:04:15 compute-0 systemd[1]: Stopped NTP client/server.
Dec 10 10:04:15 compute-0 systemd[1]: Starting NTP client/server...
Dec 10 10:04:15 compute-0 chronyd[64983]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 10 10:04:15 compute-0 chronyd[64983]: Frequency -26.336 +/- 0.059 ppm read from /var/lib/chrony/drift
Dec 10 10:04:15 compute-0 chronyd[64983]: Loaded seccomp filter (level 2)
Dec 10 10:04:15 compute-0 systemd[1]: Started NTP client/server.
Dec 10 10:04:15 compute-0 sudo[64973]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:15 compute-0 sshd-session[60127]: Connection closed by 192.168.122.30 port 57018
Dec 10 10:04:15 compute-0 sshd-session[60124]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:04:15 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Dec 10 10:04:15 compute-0 systemd[1]: session-14.scope: Consumed 26.697s CPU time.
Dec 10 10:04:15 compute-0 systemd-logind[787]: Session 14 logged out. Waiting for processes to exit.
Dec 10 10:04:15 compute-0 systemd-logind[787]: Removed session 14.
Dec 10 10:04:20 compute-0 sshd-session[65009]: Accepted publickey for zuul from 192.168.122.30 port 51740 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:04:20 compute-0 systemd-logind[787]: New session 15 of user zuul.
Dec 10 10:04:20 compute-0 systemd[1]: Started Session 15 of User zuul.
Dec 10 10:04:20 compute-0 sshd-session[65009]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:04:22 compute-0 python3.9[65162]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:04:22 compute-0 sudo[65316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvvtpwnlpvsalhtzkscqhcooqtjreudp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361062.452747-33-25332099938602/AnsiballZ_file.py'
Dec 10 10:04:22 compute-0 sudo[65316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:23 compute-0 python3.9[65318]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:23 compute-0 sudo[65316]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:23 compute-0 sudo[65491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlpqxgkokuqwwalhltkzymgqyfgbbhny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361063.3544564-41-69512746149939/AnsiballZ_stat.py'
Dec 10 10:04:23 compute-0 sudo[65491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:24 compute-0 python3.9[65493]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:24 compute-0 sudo[65491]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:24 compute-0 sudo[65569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlduratgrshhlphwyaoiauwzqqblmowg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361063.3544564-41-69512746149939/AnsiballZ_file.py'
Dec 10 10:04:24 compute-0 sudo[65569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:24 compute-0 python3.9[65571]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.y9yo6yl4 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:24 compute-0 sudo[65569]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:25 compute-0 sudo[65721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haezgbbeizmoszbranjknlbptbvpgwgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361064.8790033-61-54458702194780/AnsiballZ_stat.py'
Dec 10 10:04:25 compute-0 sudo[65721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:25 compute-0 python3.9[65723]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:25 compute-0 sudo[65721]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:25 compute-0 sudo[65844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsnulmvuocalrhbmqhoaemmekduujrll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361064.8790033-61-54458702194780/AnsiballZ_copy.py'
Dec 10 10:04:25 compute-0 sudo[65844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:25 compute-0 python3.9[65846]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361064.8790033-61-54458702194780/.source _original_basename=.mgiidmj9 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:25 compute-0 sudo[65844]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:26 compute-0 sudo[65996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqkzdyklycliofaogifravqjugqqkfrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361066.290613-77-168341333754722/AnsiballZ_file.py'
Dec 10 10:04:26 compute-0 sudo[65996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:26 compute-0 python3.9[65998]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:04:26 compute-0 sudo[65996]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:27 compute-0 sudo[66148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feizwbxixagjusvlepsdkxszcfusvfnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361066.9192643-85-268905799808578/AnsiballZ_stat.py'
Dec 10 10:04:27 compute-0 sudo[66148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:27 compute-0 python3.9[66150]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:27 compute-0 sudo[66148]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:27 compute-0 sudo[66271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyygvcjbdvrviyqtcsvzpzbqbvoflsue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361066.9192643-85-268905799808578/AnsiballZ_copy.py'
Dec 10 10:04:27 compute-0 sudo[66271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:27 compute-0 python3.9[66273]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361066.9192643-85-268905799808578/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:04:28 compute-0 sudo[66271]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:28 compute-0 sudo[66423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvctftoxppbohqvsxeluiisixkclgxux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361068.1775951-85-74376890924181/AnsiballZ_stat.py'
Dec 10 10:04:28 compute-0 sudo[66423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:28 compute-0 python3.9[66425]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:28 compute-0 sudo[66423]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:29 compute-0 sudo[66546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjrzndrfbibwcodwxvevnkbbxfnzsanf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361068.1775951-85-74376890924181/AnsiballZ_copy.py'
Dec 10 10:04:29 compute-0 sudo[66546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:29 compute-0 python3.9[66548]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361068.1775951-85-74376890924181/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:04:29 compute-0 sudo[66546]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:29 compute-0 sudo[66698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yupdnkpvidavoxokbtcedhcrippyfger ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361069.4312356-114-21935960195545/AnsiballZ_file.py'
Dec 10 10:04:29 compute-0 sudo[66698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:29 compute-0 python3.9[66700]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:29 compute-0 sudo[66698]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:30 compute-0 sudo[66850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlywvewrvsiixtnucbetnrhoyjkvfxai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361070.058203-122-47646259222896/AnsiballZ_stat.py'
Dec 10 10:04:30 compute-0 sudo[66850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:30 compute-0 python3.9[66852]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:30 compute-0 sudo[66850]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:30 compute-0 sudo[66973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcifxfubuqqcbzwycsfzriscqkldisaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361070.058203-122-47646259222896/AnsiballZ_copy.py'
Dec 10 10:04:30 compute-0 sudo[66973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:31 compute-0 python3.9[66975]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361070.058203-122-47646259222896/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:31 compute-0 sudo[66973]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:31 compute-0 sudo[67125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuvskrfdodgeufmurtmojailbzxxcpzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361071.2931032-137-42752614403916/AnsiballZ_stat.py'
Dec 10 10:04:31 compute-0 sudo[67125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:31 compute-0 python3.9[67127]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:31 compute-0 sudo[67125]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:32 compute-0 sudo[67248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mesjuhyhfcdverkfgmauyvyvlszhmvaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361071.2931032-137-42752614403916/AnsiballZ_copy.py'
Dec 10 10:04:32 compute-0 sudo[67248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:32 compute-0 python3.9[67250]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361071.2931032-137-42752614403916/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:32 compute-0 sudo[67248]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:33 compute-0 sudo[67400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhgcoousugktpgmwhgiovzmwhgnxegsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361072.5169904-152-100736487096393/AnsiballZ_systemd.py'
Dec 10 10:04:33 compute-0 sudo[67400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:33 compute-0 python3.9[67402]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:04:33 compute-0 systemd[1]: Reloading.
Dec 10 10:04:33 compute-0 systemd-sysv-generator[67429]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:04:33 compute-0 systemd-rc-local-generator[67424]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:04:33 compute-0 systemd[1]: Reloading.
Dec 10 10:04:33 compute-0 systemd-rc-local-generator[67469]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:04:33 compute-0 systemd-sysv-generator[67473]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:04:33 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Dec 10 10:04:33 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Dec 10 10:04:33 compute-0 sudo[67400]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:34 compute-0 sudo[67628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xobojkelunqssrwwbafneyzsbakotlbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361074.0686994-160-4468938273927/AnsiballZ_stat.py'
Dec 10 10:04:34 compute-0 sudo[67628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:34 compute-0 python3.9[67630]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:34 compute-0 sudo[67628]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:34 compute-0 sudo[67751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aphfvtzbwmozlpqyfkkflxwaxgexxjmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361074.0686994-160-4468938273927/AnsiballZ_copy.py'
Dec 10 10:04:34 compute-0 sudo[67751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:35 compute-0 python3.9[67753]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361074.0686994-160-4468938273927/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:35 compute-0 sudo[67751]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:35 compute-0 sudo[67903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpaaskflzsiimexeuavxljmwzutmkoqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361075.315013-175-57028164535558/AnsiballZ_stat.py'
Dec 10 10:04:35 compute-0 sudo[67903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:35 compute-0 python3.9[67905]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:35 compute-0 sudo[67903]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:36 compute-0 sudo[68026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdiyskrzohnbbxfxfhkfdertqpxyngdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361075.315013-175-57028164535558/AnsiballZ_copy.py'
Dec 10 10:04:36 compute-0 sudo[68026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:36 compute-0 python3.9[68028]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361075.315013-175-57028164535558/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:36 compute-0 sudo[68026]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:36 compute-0 sudo[68178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ievxguqbotamlgrshjklgrggckwptprx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361076.4611337-190-41718815040256/AnsiballZ_systemd.py'
Dec 10 10:04:36 compute-0 sudo[68178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:37 compute-0 python3.9[68180]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:04:37 compute-0 systemd[1]: Reloading.
Dec 10 10:04:37 compute-0 systemd-rc-local-generator[68208]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:04:37 compute-0 systemd-sysv-generator[68212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:04:37 compute-0 systemd[1]: Reloading.
Dec 10 10:04:37 compute-0 systemd-sysv-generator[68245]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:04:37 compute-0 systemd-rc-local-generator[68241]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:04:37 compute-0 systemd[1]: Starting Create netns directory...
Dec 10 10:04:37 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 10 10:04:37 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 10 10:04:37 compute-0 systemd[1]: Finished Create netns directory.
Dec 10 10:04:37 compute-0 sudo[68178]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:38 compute-0 python3.9[68407]: ansible-ansible.builtin.service_facts Invoked
Dec 10 10:04:38 compute-0 network[68424]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 10 10:04:38 compute-0 network[68425]: 'network-scripts' will be removed from distribution in near future.
Dec 10 10:04:38 compute-0 network[68426]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 10 10:04:42 compute-0 sudo[68686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeixokqgnmshhrrmqoailikmtulxlulg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361082.207368-206-182183324002440/AnsiballZ_systemd.py'
Dec 10 10:04:42 compute-0 sudo[68686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:42 compute-0 python3.9[68688]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:04:42 compute-0 systemd[1]: Reloading.
Dec 10 10:04:42 compute-0 systemd-rc-local-generator[68717]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:04:42 compute-0 systemd-sysv-generator[68723]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:04:43 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 10 10:04:43 compute-0 iptables.init[68729]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 10 10:04:43 compute-0 iptables.init[68729]: iptables: Flushing firewall rules: [  OK  ]
Dec 10 10:04:43 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Dec 10 10:04:43 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 10 10:04:43 compute-0 sudo[68686]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:43 compute-0 sudo[68923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubjrvdhxwieialxkburzbfryzuzwuevo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361083.5787978-206-33230909427572/AnsiballZ_systemd.py'
Dec 10 10:04:43 compute-0 sudo[68923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:44 compute-0 python3.9[68925]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:04:44 compute-0 sudo[68923]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:44 compute-0 sudo[69077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulhzimcmxxttqvehcmmndhmccujlfhuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361084.4541688-222-234008424306107/AnsiballZ_systemd.py'
Dec 10 10:04:44 compute-0 sudo[69077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:45 compute-0 python3.9[69079]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:04:46 compute-0 systemd[1]: Reloading.
Dec 10 10:04:46 compute-0 systemd-rc-local-generator[69109]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:04:46 compute-0 systemd-sysv-generator[69112]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:04:46 compute-0 systemd[1]: Starting Netfilter Tables...
Dec 10 10:04:46 compute-0 systemd[1]: Finished Netfilter Tables.
Dec 10 10:04:46 compute-0 sudo[69077]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:47 compute-0 sudo[69269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoeyogxyruppxmmzhdtstuajzecotduf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361086.733642-230-32175380302811/AnsiballZ_command.py'
Dec 10 10:04:47 compute-0 sudo[69269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:47 compute-0 python3.9[69271]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:04:47 compute-0 sudo[69269]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:48 compute-0 sudo[69422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxxarpvxbrtikdbjmpnyoxfwzqzqepie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361087.801573-244-69759744101076/AnsiballZ_stat.py'
Dec 10 10:04:48 compute-0 sudo[69422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:48 compute-0 python3.9[69424]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:48 compute-0 sudo[69422]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:48 compute-0 sudo[69547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvosjjpvbahzvnfkjzswzbgeqbpevmav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361087.801573-244-69759744101076/AnsiballZ_copy.py'
Dec 10 10:04:48 compute-0 sudo[69547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:48 compute-0 python3.9[69549]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361087.801573-244-69759744101076/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:48 compute-0 sudo[69547]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:49 compute-0 sudo[69700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uomkxdggaxkslhkwetlzdjtkwiblimkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361089.1382024-259-123276860852404/AnsiballZ_systemd.py'
Dec 10 10:04:49 compute-0 sudo[69700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:49 compute-0 python3.9[69702]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:04:49 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Dec 10 10:04:49 compute-0 sshd[1007]: Received SIGHUP; restarting.
Dec 10 10:04:49 compute-0 sshd[1007]: Server listening on 0.0.0.0 port 22.
Dec 10 10:04:49 compute-0 sshd[1007]: Server listening on :: port 22.
Dec 10 10:04:49 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Dec 10 10:04:49 compute-0 sudo[69700]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:50 compute-0 sudo[69856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aytomnbrkvnwgnmjhncrkzafsdogqfqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361089.998479-267-93141141998921/AnsiballZ_file.py'
Dec 10 10:04:50 compute-0 sudo[69856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:50 compute-0 python3.9[69858]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:50 compute-0 sudo[69856]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:51 compute-0 sudo[70008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwnvxeddajvxpadfgkxcqmdkizskywwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361090.7150185-275-211584562577077/AnsiballZ_stat.py'
Dec 10 10:04:51 compute-0 sudo[70008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:51 compute-0 python3.9[70010]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:51 compute-0 sudo[70008]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:51 compute-0 sudo[70131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acxllmlulxulfkhcayqcxhvzmgzxfqwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361090.7150185-275-211584562577077/AnsiballZ_copy.py'
Dec 10 10:04:51 compute-0 sudo[70131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:51 compute-0 python3.9[70133]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361090.7150185-275-211584562577077/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:51 compute-0 sudo[70131]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:52 compute-0 sudo[70283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohctydezvfdnnispwhhudfykieskkolf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361092.0751908-293-137171477526650/AnsiballZ_timezone.py'
Dec 10 10:04:52 compute-0 sudo[70283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:52 compute-0 python3.9[70285]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 10 10:04:52 compute-0 systemd[1]: Starting Time & Date Service...
Dec 10 10:04:52 compute-0 systemd[1]: Started Time & Date Service.
Dec 10 10:04:52 compute-0 sudo[70283]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:53 compute-0 sudo[70439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsigidzahzafkdoauhqsuabcgdolftml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361093.0936112-302-278341044594396/AnsiballZ_file.py'
Dec 10 10:04:53 compute-0 sudo[70439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:53 compute-0 python3.9[70441]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:53 compute-0 sudo[70439]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:54 compute-0 sudo[70591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nefjeucvgkbzjfmxkpctunjebsjumeza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361093.7938488-310-179895125403513/AnsiballZ_stat.py'
Dec 10 10:04:54 compute-0 sudo[70591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:54 compute-0 python3.9[70593]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:54 compute-0 sudo[70591]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:54 compute-0 sudo[70714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbogkduaydqtxpdrlcymeaoycddihobu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361093.7938488-310-179895125403513/AnsiballZ_copy.py'
Dec 10 10:04:54 compute-0 sudo[70714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:54 compute-0 python3.9[70716]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361093.7938488-310-179895125403513/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:54 compute-0 sudo[70714]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:55 compute-0 sudo[70866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbvgyvqllwzllqdxsmxnxinyyordrmlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361094.928627-325-101082306612084/AnsiballZ_stat.py'
Dec 10 10:04:55 compute-0 sudo[70866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:55 compute-0 python3.9[70868]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:55 compute-0 sudo[70866]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:55 compute-0 sudo[70989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeypcshehkhpcxcxjfzjgrsdtwnnskqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361094.928627-325-101082306612084/AnsiballZ_copy.py'
Dec 10 10:04:55 compute-0 sudo[70989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:56 compute-0 python3.9[70991]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361094.928627-325-101082306612084/.source.yaml _original_basename=.zjv_ea2x follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:56 compute-0 sudo[70989]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:56 compute-0 sudo[71141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaneazhkdexioprlgmznugwdvejluvzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361096.1953804-340-149807432356102/AnsiballZ_stat.py'
Dec 10 10:04:56 compute-0 sudo[71141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:56 compute-0 python3.9[71143]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:04:56 compute-0 sudo[71141]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:57 compute-0 sudo[71264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcwtnfdeqypjvmszfdxobxyqiemknlju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361096.1953804-340-149807432356102/AnsiballZ_copy.py'
Dec 10 10:04:57 compute-0 sudo[71264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:57 compute-0 python3.9[71266]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361096.1953804-340-149807432356102/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:04:57 compute-0 sudo[71264]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:57 compute-0 sudo[71416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzxknxrokopxaiyfoxlujrskfimodzti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361097.5651855-355-94145259811697/AnsiballZ_command.py'
Dec 10 10:04:57 compute-0 sudo[71416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:58 compute-0 python3.9[71418]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:04:58 compute-0 sudo[71416]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:58 compute-0 sudo[71569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqrjbafeyeflivknewkkxhirypvgutlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361098.2356796-363-42131090161105/AnsiballZ_command.py'
Dec 10 10:04:58 compute-0 sudo[71569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:58 compute-0 python3.9[71571]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:04:58 compute-0 sudo[71569]: pam_unix(sudo:session): session closed for user root
Dec 10 10:04:59 compute-0 sudo[71722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlvyyyeiuyuboowalojsicembvhcpaev ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361098.900264-371-30901069480178/AnsiballZ_edpm_nftables_from_files.py'
Dec 10 10:04:59 compute-0 sudo[71722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:04:59 compute-0 python3[71724]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 10 10:04:59 compute-0 sudo[71722]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:00 compute-0 sudo[71874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prdcuoydqildlqflnppseslldjafsjwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361099.7075317-379-155868979070462/AnsiballZ_stat.py'
Dec 10 10:05:00 compute-0 sudo[71874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:00 compute-0 python3.9[71876]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:05:00 compute-0 sudo[71874]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:00 compute-0 sudo[71997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvctexppzgwammritrpoqacktvmkqvsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361099.7075317-379-155868979070462/AnsiballZ_copy.py'
Dec 10 10:05:00 compute-0 sudo[71997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:00 compute-0 python3.9[71999]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361099.7075317-379-155868979070462/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:05:00 compute-0 sudo[71997]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:01 compute-0 sudo[72149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bodgmuaakhspvayzkcqwnjjndgpelkzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361100.9905996-394-136162619999135/AnsiballZ_stat.py'
Dec 10 10:05:01 compute-0 sudo[72149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:01 compute-0 python3.9[72151]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:05:01 compute-0 sudo[72149]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:01 compute-0 sudo[72272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgoxbjxthdoecbdxhjgvlsyolrmbvuse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361100.9905996-394-136162619999135/AnsiballZ_copy.py'
Dec 10 10:05:01 compute-0 sudo[72272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:02 compute-0 python3.9[72274]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361100.9905996-394-136162619999135/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:05:02 compute-0 sudo[72272]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:02 compute-0 sudo[72424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eckvixiidefjfsbdvdghqcsdskvxodzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361102.2128966-409-107478441208909/AnsiballZ_stat.py'
Dec 10 10:05:02 compute-0 sudo[72424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:02 compute-0 python3.9[72426]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:05:02 compute-0 sudo[72424]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:03 compute-0 sudo[72547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hknggzcgkwtcizwbhhibwggmzgcywxot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361102.2128966-409-107478441208909/AnsiballZ_copy.py'
Dec 10 10:05:03 compute-0 sudo[72547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:03 compute-0 python3.9[72549]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361102.2128966-409-107478441208909/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:05:03 compute-0 sudo[72547]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:03 compute-0 sudo[72699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oynfripyrosooazptaausdnjfzhqcmvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361103.4198093-424-11975094717055/AnsiballZ_stat.py'
Dec 10 10:05:03 compute-0 sudo[72699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:03 compute-0 python3.9[72701]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:05:03 compute-0 sudo[72699]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:04 compute-0 sudo[72822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joufgeznbfiuttmgshymnokinkzoqlbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361103.4198093-424-11975094717055/AnsiballZ_copy.py'
Dec 10 10:05:04 compute-0 sudo[72822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:04 compute-0 python3.9[72824]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361103.4198093-424-11975094717055/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:05:04 compute-0 sudo[72822]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:05 compute-0 sudo[72974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jotmrdkxhipkiooywuwkhfytobxtmdwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361105.0354462-439-155150012281463/AnsiballZ_stat.py'
Dec 10 10:05:05 compute-0 sudo[72974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:05 compute-0 python3.9[72976]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:05:05 compute-0 sudo[72974]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:05 compute-0 sudo[73097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afpuujbobkaurueoiwekoynxgztzuihn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361105.0354462-439-155150012281463/AnsiballZ_copy.py'
Dec 10 10:05:05 compute-0 sudo[73097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:06 compute-0 python3.9[73099]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361105.0354462-439-155150012281463/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:05:06 compute-0 sudo[73097]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:06 compute-0 sudo[73249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbjuqlhzkftggszmautllgtbekvcrvjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361106.4043896-454-249616644714418/AnsiballZ_file.py'
Dec 10 10:05:06 compute-0 sudo[73249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:06 compute-0 python3.9[73251]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:05:06 compute-0 sudo[73249]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:07 compute-0 sudo[73401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zubrdtcloutaqwbtiecwqtkzuwmqxsuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361107.0441704-462-258127006944980/AnsiballZ_command.py'
Dec 10 10:05:07 compute-0 sudo[73401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:07 compute-0 python3.9[73403]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:05:07 compute-0 sudo[73401]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:08 compute-0 sudo[73560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmpfjoaqizszsjsqpsxfftmmotenskky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361107.7651455-470-138138051268061/AnsiballZ_blockinfile.py'
Dec 10 10:05:08 compute-0 sudo[73560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:08 compute-0 python3.9[73562]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:05:08 compute-0 sudo[73560]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:08 compute-0 sudo[73713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srctgpdkndvmehgyzlirhpudxzwtazjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361108.66744-479-270816758826413/AnsiballZ_file.py'
Dec 10 10:05:08 compute-0 sudo[73713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:09 compute-0 python3.9[73715]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:05:09 compute-0 sudo[73713]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:09 compute-0 sudo[73865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxrwkwsvenumslrsrgtzvezbzpwamfqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361109.3659441-479-210189782999823/AnsiballZ_file.py'
Dec 10 10:05:09 compute-0 sudo[73865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:09 compute-0 python3.9[73867]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:05:09 compute-0 sudo[73865]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:10 compute-0 sudo[74017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ossmcszkmgxjnislirdcmanalogbpdvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361110.07059-494-76483577889265/AnsiballZ_mount.py'
Dec 10 10:05:10 compute-0 sudo[74017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:10 compute-0 python3.9[74019]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 10 10:05:10 compute-0 sudo[74017]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:10 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 10 10:05:10 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 10 10:05:11 compute-0 sudo[74171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwckwnwoubewbhuyayntwcedidrowlng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361110.9925156-494-139984652081555/AnsiballZ_mount.py'
Dec 10 10:05:11 compute-0 sudo[74171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:11 compute-0 python3.9[74173]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 10 10:05:11 compute-0 sudo[74171]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:11 compute-0 sshd-session[65012]: Connection closed by 192.168.122.30 port 51740
Dec 10 10:05:11 compute-0 sshd-session[65009]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:05:11 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Dec 10 10:05:11 compute-0 systemd[1]: session-15.scope: Consumed 36.624s CPU time.
Dec 10 10:05:11 compute-0 systemd-logind[787]: Session 15 logged out. Waiting for processes to exit.
Dec 10 10:05:11 compute-0 systemd-logind[787]: Removed session 15.
Dec 10 10:05:17 compute-0 sshd-session[74199]: Accepted publickey for zuul from 192.168.122.30 port 46394 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:05:17 compute-0 systemd-logind[787]: New session 16 of user zuul.
Dec 10 10:05:17 compute-0 systemd[1]: Started Session 16 of User zuul.
Dec 10 10:05:17 compute-0 sshd-session[74199]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:05:18 compute-0 sudo[74352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clxdgnisxmpyrxspjbxnqytdiwssfdvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361117.6785836-16-64193522392928/AnsiballZ_tempfile.py'
Dec 10 10:05:18 compute-0 sudo[74352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:18 compute-0 python3.9[74354]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 10 10:05:18 compute-0 sudo[74352]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:18 compute-0 sudo[74504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipfumdfjfqfihqozliwkyilpgrqgmrht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361118.5378957-28-78801825517787/AnsiballZ_stat.py'
Dec 10 10:05:18 compute-0 sudo[74504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:19 compute-0 python3.9[74506]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:05:19 compute-0 sudo[74504]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:19 compute-0 sudo[74656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxgcgmtgtlupsfztwzmwxthomtswjhut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361119.3348577-38-182026440035768/AnsiballZ_setup.py'
Dec 10 10:05:19 compute-0 sudo[74656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:20 compute-0 python3.9[74658]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:05:20 compute-0 sudo[74656]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:20 compute-0 sudo[74808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivnuryobcogzrwacovagskgxoafafusn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361120.4028234-47-53746530489955/AnsiballZ_blockinfile.py'
Dec 10 10:05:20 compute-0 sudo[74808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:21 compute-0 python3.9[74810]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDdhen7RpUQVWs0tOQi2Qjthc+noAD7YJllzWmH+d6gvCBnbvdQTwsLje12wGUMBNM4ObIs0sWhXzTZckhHOy5PAL8N+mikbMCIFBdHPgCG+fHYnnqM4dzhtZ8DuCYulCrbx5WRAbQJItKQK6YNGb5Ufd+kAkvmPZeYdxXHVX8S+aMMuzj6LL7Rxr4BXEFqHUT9IdBdokmjPuHXeHWbWW+pMM7cs7dhoEw6R+sQ/Sa3UkdbOkZsu173hG8e+i+mDWvcHcBGTf/RK14CDClQnn4WN97RTP/OGpBdyvdx/vSac4EmR+d0azTVjT3tcRsjh8wQ9JMeo9Vf5pKTpmAGBvfHre70Wkb8z5L17RXUZgbC9ePcC5V0KP4Qi4Zw51Dg2zjnlm8zNuZ1CuWdgxc3XXJCkpFsaB3jl+VPTeSexgWucBKM4uB7E3izRbDjVv1CH7paULUtUzSpn+uQPbbEfOSD1gpbFPxzF70luYkf4wXwVmChwkIupSFpl3Wm6T9/pSk=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIM6AvrnsdGQRe0fgGivWOBXhi8jLEWu1YuDP68CP19ms
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDUqScFNQ1kkt9TFgJne76pC9coAP4wHTSJwj9DEiANz2IDfI8Gt9NstdpT5iDATcfuFjW7/iMnCzMaCskSFMaQ=
                                             create=True mode=0644 path=/tmp/ansible.bj657z4x state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:05:21 compute-0 sudo[74808]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:21 compute-0 sudo[74960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlvfdycpvurnohbyjsvevoljrpaqrlyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361121.205431-55-96489089494913/AnsiballZ_command.py'
Dec 10 10:05:21 compute-0 sudo[74960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:21 compute-0 python3.9[74962]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.bj657z4x' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:05:21 compute-0 sudo[74960]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:22 compute-0 sudo[75114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tamnrynoakrktbplzhcwuqtabxnlzwnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361122.00771-63-175335641163568/AnsiballZ_file.py'
Dec 10 10:05:22 compute-0 sudo[75114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:22 compute-0 python3.9[75116]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.bj657z4x state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:05:22 compute-0 sudo[75114]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:22 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 10 10:05:23 compute-0 sshd-session[74202]: Connection closed by 192.168.122.30 port 46394
Dec 10 10:05:23 compute-0 sshd-session[74199]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:05:23 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Dec 10 10:05:23 compute-0 systemd[1]: session-16.scope: Consumed 3.529s CPU time.
Dec 10 10:05:23 compute-0 systemd-logind[787]: Session 16 logged out. Waiting for processes to exit.
Dec 10 10:05:23 compute-0 systemd-logind[787]: Removed session 16.
Dec 10 10:05:29 compute-0 sshd-session[75144]: Accepted publickey for zuul from 192.168.122.30 port 37486 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:05:29 compute-0 systemd-logind[787]: New session 17 of user zuul.
Dec 10 10:05:29 compute-0 systemd[1]: Started Session 17 of User zuul.
Dec 10 10:05:29 compute-0 sshd-session[75144]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:05:30 compute-0 python3.9[75297]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:05:31 compute-0 sudo[75451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lizjgoiiqccydazxhtzaxzmjxubdvsay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361130.5504167-32-64696086987138/AnsiballZ_systemd.py'
Dec 10 10:05:31 compute-0 sudo[75451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:31 compute-0 python3.9[75453]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 10 10:05:31 compute-0 sudo[75451]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:31 compute-0 sudo[75605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbkkmqkfqfwkyupzuadbapaxdxfhqlpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361131.7026336-40-61458756896450/AnsiballZ_systemd.py'
Dec 10 10:05:31 compute-0 sudo[75605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:32 compute-0 python3.9[75607]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:05:32 compute-0 sudo[75605]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:32 compute-0 sudo[75758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yljulbollrftkvitafkfibqpbfcrsuxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361132.4889662-49-5013408258870/AnsiballZ_command.py'
Dec 10 10:05:32 compute-0 sudo[75758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:33 compute-0 python3.9[75760]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:05:33 compute-0 sudo[75758]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:33 compute-0 sudo[75911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhigcvmigvcidqyfgbyiyctgyfvckfxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361133.3747983-57-84927664634323/AnsiballZ_stat.py'
Dec 10 10:05:33 compute-0 sudo[75911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:33 compute-0 python3.9[75913]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:05:33 compute-0 sudo[75911]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:34 compute-0 sudo[76065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itmkfwlhbcvrthdndxpnqeoqbdvkjwsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361134.1270213-65-188606250396111/AnsiballZ_command.py'
Dec 10 10:05:34 compute-0 sudo[76065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:34 compute-0 python3.9[76067]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:05:34 compute-0 sudo[76065]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:35 compute-0 sudo[76220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytsbwnsvrtebcakbuxobxrdgiuyxnlos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361134.758376-73-1188127983676/AnsiballZ_file.py'
Dec 10 10:05:35 compute-0 sudo[76220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:35 compute-0 python3.9[76222]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:05:35 compute-0 sudo[76220]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:35 compute-0 sshd-session[75147]: Connection closed by 192.168.122.30 port 37486
Dec 10 10:05:35 compute-0 sshd-session[75144]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:05:35 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Dec 10 10:05:35 compute-0 systemd[1]: session-17.scope: Consumed 4.400s CPU time.
Dec 10 10:05:35 compute-0 systemd-logind[787]: Session 17 logged out. Waiting for processes to exit.
Dec 10 10:05:35 compute-0 systemd-logind[787]: Removed session 17.
Dec 10 10:05:41 compute-0 sshd-session[76247]: Accepted publickey for zuul from 192.168.122.30 port 54414 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:05:41 compute-0 systemd-logind[787]: New session 18 of user zuul.
Dec 10 10:05:41 compute-0 systemd[1]: Started Session 18 of User zuul.
Dec 10 10:05:41 compute-0 sshd-session[76247]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:05:42 compute-0 python3.9[76400]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:05:43 compute-0 sudo[76554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwfwfjpmunxyvwsugkpmoyodbjagqcln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361143.1138217-34-234350943915984/AnsiballZ_setup.py'
Dec 10 10:05:43 compute-0 sudo[76554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:43 compute-0 python3.9[76556]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:05:43 compute-0 sudo[76554]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:44 compute-0 sudo[76638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekyctdjzqzxywjghftnpvttsrpfgzxar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361143.1138217-34-234350943915984/AnsiballZ_dnf.py'
Dec 10 10:05:44 compute-0 sudo[76638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:44 compute-0 python3.9[76640]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 10 10:05:45 compute-0 sudo[76638]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:46 compute-0 python3.9[76791]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:05:47 compute-0 python3.9[76942]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 10 10:05:48 compute-0 python3.9[77092]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:05:49 compute-0 python3.9[77242]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:05:49 compute-0 sshd-session[76250]: Connection closed by 192.168.122.30 port 54414
Dec 10 10:05:49 compute-0 sshd-session[76247]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:05:49 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Dec 10 10:05:49 compute-0 systemd[1]: session-18.scope: Consumed 6.048s CPU time.
Dec 10 10:05:49 compute-0 systemd-logind[787]: Session 18 logged out. Waiting for processes to exit.
Dec 10 10:05:49 compute-0 systemd-logind[787]: Removed session 18.
Dec 10 10:05:55 compute-0 sshd-session[77267]: Accepted publickey for zuul from 192.168.122.30 port 58850 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:05:55 compute-0 systemd-logind[787]: New session 19 of user zuul.
Dec 10 10:05:55 compute-0 systemd[1]: Started Session 19 of User zuul.
Dec 10 10:05:55 compute-0 sshd-session[77267]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:05:56 compute-0 python3.9[77420]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:05:57 compute-0 sudo[77574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uenwhrmhnmiftpjmgvqwlhctkfhyrtgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361157.2264807-50-275705752830499/AnsiballZ_file.py'
Dec 10 10:05:57 compute-0 sudo[77574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:57 compute-0 python3.9[77576]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:05:57 compute-0 sudo[77574]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:58 compute-0 sudo[77726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhoqpfjwplegcfyuvgdbvqzolzwmijip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361157.9947975-50-85535514680223/AnsiballZ_file.py'
Dec 10 10:05:58 compute-0 sudo[77726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:58 compute-0 python3.9[77728]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:05:58 compute-0 sudo[77726]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:59 compute-0 sudo[77878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdfjxhbedtwsfgdvjktsesedttzjfueh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361158.6263406-65-33660595174635/AnsiballZ_stat.py'
Dec 10 10:05:59 compute-0 sudo[77878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:05:59 compute-0 python3.9[77880]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:05:59 compute-0 sudo[77878]: pam_unix(sudo:session): session closed for user root
Dec 10 10:05:59 compute-0 sudo[78001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knuqscgbpnefkkgzvwowydsletwxlabv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361158.6263406-65-33660595174635/AnsiballZ_copy.py'
Dec 10 10:05:59 compute-0 sudo[78001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:00 compute-0 python3.9[78003]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361158.6263406-65-33660595174635/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=bc1babd9079bb0488aab631220edc6bea462c57a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:00 compute-0 sudo[78001]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:00 compute-0 sudo[78153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqtkennohbkboxtomqzvnremxkdjluoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361160.2215514-65-274841616460858/AnsiballZ_stat.py'
Dec 10 10:06:00 compute-0 sudo[78153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:00 compute-0 python3.9[78155]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:00 compute-0 sudo[78153]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:01 compute-0 sudo[78276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztjxktlaiarnqhrftqnvnwnhhllrfham ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361160.2215514-65-274841616460858/AnsiballZ_copy.py'
Dec 10 10:06:01 compute-0 sudo[78276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:01 compute-0 python3.9[78278]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361160.2215514-65-274841616460858/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=4bcedbd91374c99bafaa0873b956f4a2760d3bb0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:01 compute-0 sudo[78276]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:01 compute-0 sudo[78428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otldozrontiweuuqwrdvmktgxbbkpsae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361161.4512832-65-89394388715904/AnsiballZ_stat.py'
Dec 10 10:06:01 compute-0 sudo[78428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:01 compute-0 python3.9[78430]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:01 compute-0 sudo[78428]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:02 compute-0 sudo[78551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzojduitkptlqmlibhlidbnitkoadrfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361161.4512832-65-89394388715904/AnsiballZ_copy.py'
Dec 10 10:06:02 compute-0 sudo[78551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:02 compute-0 python3.9[78553]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361161.4512832-65-89394388715904/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=a1059023f744750b8c5b6ae79d977e39ed46ef31 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:02 compute-0 sudo[78551]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:03 compute-0 sudo[78703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhosmivygwaktuoywjhlagrmkgdzmmfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361162.724468-109-262914732125272/AnsiballZ_file.py'
Dec 10 10:06:03 compute-0 sudo[78703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:03 compute-0 python3.9[78705]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:03 compute-0 sudo[78703]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:03 compute-0 sudo[78855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiltfbrqlxfpwgzhhcygwvgzjrwyzwmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361163.3530977-109-207006976257067/AnsiballZ_file.py'
Dec 10 10:06:03 compute-0 sudo[78855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:03 compute-0 python3.9[78857]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:03 compute-0 sudo[78855]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:04 compute-0 sudo[79007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecipzpghotqgjarpjljgkdseycxgkkjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361164.0053904-124-247491111069740/AnsiballZ_stat.py'
Dec 10 10:06:04 compute-0 sudo[79007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:04 compute-0 python3.9[79009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:04 compute-0 sudo[79007]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:05 compute-0 sudo[79130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skagvkrngtkpibrsfzzzimfbxmczzmyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361164.0053904-124-247491111069740/AnsiballZ_copy.py'
Dec 10 10:06:05 compute-0 sudo[79130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:05 compute-0 python3.9[79132]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361164.0053904-124-247491111069740/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=3ef2101d8315c3fa6841aa8af127f14ab7fabf63 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:05 compute-0 sudo[79130]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:06 compute-0 sudo[79282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppqqbrdgrmsxbhqasgnmiopwckycihzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361165.9938166-124-234744257187043/AnsiballZ_stat.py'
Dec 10 10:06:06 compute-0 sudo[79282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:06 compute-0 python3.9[79284]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:06 compute-0 sudo[79282]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:06 compute-0 sudo[79405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xypjmadrgaprhuwmvoaxzzwhsjplremq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361165.9938166-124-234744257187043/AnsiballZ_copy.py'
Dec 10 10:06:06 compute-0 sudo[79405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:07 compute-0 python3.9[79407]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361165.9938166-124-234744257187043/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f3a035168f95275862487939530607390a4e4808 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:07 compute-0 sudo[79405]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:07 compute-0 sudo[79557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuiizhpydpbyegcdkbpuksgqjnynemnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361167.2162104-124-149604100642393/AnsiballZ_stat.py'
Dec 10 10:06:07 compute-0 sudo[79557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:07 compute-0 python3.9[79559]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:07 compute-0 sudo[79557]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:08 compute-0 sudo[79680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rivdcouvcmdhkricmadafblaggsusobk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361167.2162104-124-149604100642393/AnsiballZ_copy.py'
Dec 10 10:06:08 compute-0 sudo[79680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:08 compute-0 python3.9[79682]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361167.2162104-124-149604100642393/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=f68471f77a7259040624566444ad23123d04bb14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:08 compute-0 sudo[79680]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:08 compute-0 sudo[79832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfkswiemavwezhibuwftnngklehbtdoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361168.4617505-168-216963546056647/AnsiballZ_file.py'
Dec 10 10:06:08 compute-0 sudo[79832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:08 compute-0 python3.9[79834]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:08 compute-0 sudo[79832]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:09 compute-0 sudo[79984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdvxdolvknhjjpyqjfcwzsrxtetbtsus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361169.0882263-168-27725495005485/AnsiballZ_file.py'
Dec 10 10:06:09 compute-0 sudo[79984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:09 compute-0 python3.9[79986]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:09 compute-0 sudo[79984]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:10 compute-0 sudo[80136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmmuhspfmounjskoxngtunhypwtvqccc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361169.832752-183-69420405023892/AnsiballZ_stat.py'
Dec 10 10:06:10 compute-0 sudo[80136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:10 compute-0 python3.9[80138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:10 compute-0 sudo[80136]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:10 compute-0 sudo[80259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raeeegprrlzqmhbfdlxtrbitibtldvja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361169.832752-183-69420405023892/AnsiballZ_copy.py'
Dec 10 10:06:10 compute-0 sudo[80259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:10 compute-0 python3.9[80261]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361169.832752-183-69420405023892/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=e20e52ca2ec48db55181bb6843a1b3b408ac2a3d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:10 compute-0 sudo[80259]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:11 compute-0 sudo[80411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fonbtlxuzztzvlahlaaqvybfcftoglvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361170.9643288-183-70753885663369/AnsiballZ_stat.py'
Dec 10 10:06:11 compute-0 sudo[80411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:11 compute-0 python3.9[80413]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:11 compute-0 sudo[80411]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:11 compute-0 sudo[80534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyntcxeyixliqmqzhfzxtglkwlcyhjtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361170.9643288-183-70753885663369/AnsiballZ_copy.py'
Dec 10 10:06:11 compute-0 sudo[80534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:11 compute-0 python3.9[80536]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361170.9643288-183-70753885663369/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=58163b16fd8caa9e1bb7c743495394e99877995c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:11 compute-0 sudo[80534]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:12 compute-0 sudo[80686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htevrdisfrfzuxxckrpzvbdmoetlxdhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361172.0709934-183-141210314942107/AnsiballZ_stat.py'
Dec 10 10:06:12 compute-0 sudo[80686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:12 compute-0 python3.9[80688]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:12 compute-0 sudo[80686]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:12 compute-0 sudo[80809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sggeydmoflqswccksjmvzloqaifptnmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361172.0709934-183-141210314942107/AnsiballZ_copy.py'
Dec 10 10:06:12 compute-0 sudo[80809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:13 compute-0 python3.9[80811]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361172.0709934-183-141210314942107/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=3d795fc4dd3294f0cf16715cbc64d718f8b76291 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:13 compute-0 sudo[80809]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:13 compute-0 sudo[80961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fujlfawhyelleoyurehjxvgsladfbtrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361173.3637674-227-251144506086901/AnsiballZ_file.py'
Dec 10 10:06:13 compute-0 sudo[80961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:13 compute-0 python3.9[80963]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:13 compute-0 sudo[80961]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:14 compute-0 sudo[81113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utlyhfwggnyxdejzvjljqvldgxdwniwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361174.0374644-227-162979270911943/AnsiballZ_file.py'
Dec 10 10:06:14 compute-0 sudo[81113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:14 compute-0 python3.9[81115]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:14 compute-0 sudo[81113]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:14 compute-0 sudo[81265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sasxyezpvqdzmuehafpxjxyoivihgrmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361174.634476-242-61793639767507/AnsiballZ_stat.py'
Dec 10 10:06:14 compute-0 sudo[81265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:15 compute-0 python3.9[81267]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:15 compute-0 sudo[81265]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:15 compute-0 sudo[81388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaotkyogqjrgtuxxziexemuwqcxroqtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361174.634476-242-61793639767507/AnsiballZ_copy.py'
Dec 10 10:06:15 compute-0 sudo[81388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:15 compute-0 python3.9[81390]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361174.634476-242-61793639767507/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=23a7739b43e0a1e184d981cfa081f68876b7bb14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:15 compute-0 sudo[81388]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:16 compute-0 sudo[81540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htvwivkifbesllhbmdfzubqnvypulygh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361175.7930984-242-226869617559102/AnsiballZ_stat.py'
Dec 10 10:06:16 compute-0 sudo[81540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:16 compute-0 python3.9[81542]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:16 compute-0 sudo[81540]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:16 compute-0 sudo[81663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwkxzvtixxgmizodyehubsttucixogbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361175.7930984-242-226869617559102/AnsiballZ_copy.py'
Dec 10 10:06:16 compute-0 sudo[81663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:16 compute-0 python3.9[81665]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361175.7930984-242-226869617559102/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=58163b16fd8caa9e1bb7c743495394e99877995c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:16 compute-0 sudo[81663]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:17 compute-0 sudo[81815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvyutyywtouidhzbljtynytnzeciisal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361176.946216-242-109798583844726/AnsiballZ_stat.py'
Dec 10 10:06:17 compute-0 sudo[81815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:17 compute-0 python3.9[81817]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:17 compute-0 sudo[81815]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:17 compute-0 sudo[81938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wszokmbezaczmwecmwuvlbbycsehzeef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361176.946216-242-109798583844726/AnsiballZ_copy.py'
Dec 10 10:06:17 compute-0 sudo[81938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:17 compute-0 python3.9[81940]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361176.946216-242-109798583844726/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=0c7243c9a91f0d16e976185c288b92317c83e5b8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:17 compute-0 sudo[81938]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:18 compute-0 sudo[82090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txhkbffszbhbethbmoaslwjcylnfhjjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361178.6410816-302-33179926619401/AnsiballZ_file.py'
Dec 10 10:06:18 compute-0 sudo[82090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:19 compute-0 python3.9[82092]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:19 compute-0 sudo[82090]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:19 compute-0 sudo[82242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxphpshfoynfeawolxtrsmijmjtkwvnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361179.3110108-310-191190215437864/AnsiballZ_stat.py'
Dec 10 10:06:19 compute-0 sudo[82242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:19 compute-0 python3.9[82244]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:19 compute-0 sudo[82242]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:20 compute-0 sudo[82365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmxxrldfrqgzanfgsemfqkymxirjuxns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361179.3110108-310-191190215437864/AnsiballZ_copy.py'
Dec 10 10:06:20 compute-0 sudo[82365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:20 compute-0 python3.9[82367]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361179.3110108-310-191190215437864/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21a1e9e6cd9583f67d50b5fc30bf05f5f214a4e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:20 compute-0 sudo[82365]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:20 compute-0 irqbalance[781]: Cannot change IRQ 26 affinity: Operation not permitted
Dec 10 10:06:20 compute-0 irqbalance[781]: IRQ 26 affinity is now unmanaged
Dec 10 10:06:20 compute-0 sudo[82517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxumesszzsmlwqdyxajvppoefslyduic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361180.5670416-326-185364314374221/AnsiballZ_file.py'
Dec 10 10:06:20 compute-0 sudo[82517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:21 compute-0 python3.9[82519]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:21 compute-0 sudo[82517]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:21 compute-0 sudo[82669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyhvmvfsjtfvayqczdmiqimdxyiihuft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361181.2461085-334-225127879130895/AnsiballZ_stat.py'
Dec 10 10:06:21 compute-0 sudo[82669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:21 compute-0 python3.9[82671]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:21 compute-0 sudo[82669]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:22 compute-0 sudo[82792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giadhrvwlsvxgzpguzzifvhbugglzpro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361181.2461085-334-225127879130895/AnsiballZ_copy.py'
Dec 10 10:06:22 compute-0 sudo[82792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:22 compute-0 python3.9[82794]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361181.2461085-334-225127879130895/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21a1e9e6cd9583f67d50b5fc30bf05f5f214a4e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:22 compute-0 sudo[82792]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:22 compute-0 sudo[82944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvmlwvgnxlqmxqlyhfxgrpoqlxsghngm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361182.4364398-350-120284459535866/AnsiballZ_file.py'
Dec 10 10:06:22 compute-0 sudo[82944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:22 compute-0 python3.9[82946]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:22 compute-0 sudo[82944]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:23 compute-0 sudo[83096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqklhsbwirjiarsusgwbplqryjjbebkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361183.0412862-358-26551908797117/AnsiballZ_stat.py'
Dec 10 10:06:23 compute-0 sudo[83096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:23 compute-0 python3.9[83098]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:23 compute-0 sudo[83096]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:23 compute-0 sudo[83219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-betxvxuytgqclxptczzxiwrwbtotfqqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361183.0412862-358-26551908797117/AnsiballZ_copy.py'
Dec 10 10:06:23 compute-0 sudo[83219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:23 compute-0 python3.9[83221]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361183.0412862-358-26551908797117/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21a1e9e6cd9583f67d50b5fc30bf05f5f214a4e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:24 compute-0 sudo[83219]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:24 compute-0 sudo[83371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abtthperpmnippbtfnotkjirmsvmelua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361184.2124155-374-116081619818638/AnsiballZ_file.py'
Dec 10 10:06:24 compute-0 sudo[83371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:24 compute-0 python3.9[83373]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:24 compute-0 sudo[83371]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:25 compute-0 chronyd[64983]: Selected source 162.159.200.123 (pool.ntp.org)
Dec 10 10:06:25 compute-0 sudo[83523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvfttizspxjtpnanqrqrosuqwjnjzded ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361184.8655524-382-275698267375215/AnsiballZ_stat.py'
Dec 10 10:06:25 compute-0 sudo[83523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:25 compute-0 python3.9[83525]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:25 compute-0 sudo[83523]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:25 compute-0 sudo[83646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhlfpzrszkgcchkgliwryyohssuahmzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361184.8655524-382-275698267375215/AnsiballZ_copy.py'
Dec 10 10:06:25 compute-0 sudo[83646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:25 compute-0 python3.9[83648]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361184.8655524-382-275698267375215/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21a1e9e6cd9583f67d50b5fc30bf05f5f214a4e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:25 compute-0 sudo[83646]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:26 compute-0 sudo[83798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlyypxhrxyrwxexzaztqxhqrxuaecplx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361186.1821573-398-121074188700617/AnsiballZ_file.py'
Dec 10 10:06:26 compute-0 sudo[83798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:26 compute-0 python3.9[83800]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:26 compute-0 sudo[83798]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:27 compute-0 sudo[83950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoeahwllyvtozjxzsjylzsscjmmdpfzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361186.8346734-406-148083199496955/AnsiballZ_stat.py'
Dec 10 10:06:27 compute-0 sudo[83950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:27 compute-0 python3.9[83952]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:27 compute-0 sudo[83950]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:27 compute-0 sudo[84073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jypydgyjycoxqcdayqumzwqyhiccbtiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361186.8346734-406-148083199496955/AnsiballZ_copy.py'
Dec 10 10:06:27 compute-0 sudo[84073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:27 compute-0 python3.9[84075]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361186.8346734-406-148083199496955/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21a1e9e6cd9583f67d50b5fc30bf05f5f214a4e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:27 compute-0 sudo[84073]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:28 compute-0 sudo[84225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nauzjujfpfvfrxbyhnzqaqdnijyotxic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361188.075519-422-219431069977572/AnsiballZ_file.py'
Dec 10 10:06:28 compute-0 sudo[84225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:28 compute-0 python3.9[84227]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:28 compute-0 sudo[84225]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:29 compute-0 sudo[84377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppydpenhzcomrzypwdfnzbmrcwikpbtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361188.7042048-430-19986512816684/AnsiballZ_stat.py'
Dec 10 10:06:29 compute-0 sudo[84377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:29 compute-0 python3.9[84379]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:29 compute-0 sudo[84377]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:29 compute-0 sudo[84500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufitjusqzfbrcyvopalftdyhwovfhmdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361188.7042048-430-19986512816684/AnsiballZ_copy.py'
Dec 10 10:06:29 compute-0 sudo[84500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:29 compute-0 python3.9[84502]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361188.7042048-430-19986512816684/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21a1e9e6cd9583f67d50b5fc30bf05f5f214a4e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:29 compute-0 sudo[84500]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:30 compute-0 sudo[84652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvyukdevkpdzdvnbhgwivegdjjdhynsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361190.048789-446-226179261738119/AnsiballZ_file.py'
Dec 10 10:06:30 compute-0 sudo[84652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:30 compute-0 python3.9[84654]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:30 compute-0 sudo[84652]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:31 compute-0 sudo[84804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxyrhfsdttmsvdaltsjtwtnovluzuzrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361190.7662091-454-170033240577602/AnsiballZ_stat.py'
Dec 10 10:06:31 compute-0 sudo[84804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:31 compute-0 python3.9[84806]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:31 compute-0 sudo[84804]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:31 compute-0 sudo[84927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvpdisweieqmaluajegsyxyapusicmnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361190.7662091-454-170033240577602/AnsiballZ_copy.py'
Dec 10 10:06:31 compute-0 sudo[84927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:31 compute-0 python3.9[84929]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361190.7662091-454-170033240577602/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=21a1e9e6cd9583f67d50b5fc30bf05f5f214a4e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:31 compute-0 sudo[84927]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:32 compute-0 sshd-session[77270]: Connection closed by 192.168.122.30 port 58850
Dec 10 10:06:32 compute-0 sshd-session[77267]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:06:32 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Dec 10 10:06:32 compute-0 systemd[1]: session-19.scope: Consumed 29.754s CPU time.
Dec 10 10:06:32 compute-0 systemd-logind[787]: Session 19 logged out. Waiting for processes to exit.
Dec 10 10:06:32 compute-0 systemd-logind[787]: Removed session 19.
Dec 10 10:06:37 compute-0 sshd-session[84954]: Accepted publickey for zuul from 192.168.122.30 port 56284 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:06:37 compute-0 systemd-logind[787]: New session 20 of user zuul.
Dec 10 10:06:37 compute-0 systemd[1]: Started Session 20 of User zuul.
Dec 10 10:06:37 compute-0 sshd-session[84954]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:06:38 compute-0 python3.9[85107]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:06:39 compute-0 sudo[85261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpvjlvvtncnlsheeruniknbufhbnnzlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361199.1365502-34-248380513097714/AnsiballZ_file.py'
Dec 10 10:06:39 compute-0 sudo[85261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:39 compute-0 python3.9[85263]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:39 compute-0 sudo[85261]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:40 compute-0 sudo[85413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llukshiuovzjqqgyorolyyhgdidtbnpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361199.8702915-34-61154017745253/AnsiballZ_file.py'
Dec 10 10:06:40 compute-0 sudo[85413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:40 compute-0 python3.9[85415]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:06:40 compute-0 sudo[85413]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:41 compute-0 python3.9[85565]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:06:41 compute-0 sudo[85715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svghtihmtcuxbxjswlujngznqurvtcvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361201.2728171-57-54284097743201/AnsiballZ_seboolean.py'
Dec 10 10:06:41 compute-0 sudo[85715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:41 compute-0 python3.9[85717]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 10 10:06:43 compute-0 sudo[85715]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:43 compute-0 sudo[85871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xezittpirsxaokvjkebtpepljnxyysic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361203.3238864-67-170212567017725/AnsiballZ_setup.py'
Dec 10 10:06:43 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 10 10:06:43 compute-0 sudo[85871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:43 compute-0 python3.9[85873]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:06:44 compute-0 sudo[85871]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:44 compute-0 sudo[85955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weonttbedsptwkluqnornmtdtlblhqhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361203.3238864-67-170212567017725/AnsiballZ_dnf.py'
Dec 10 10:06:44 compute-0 sudo[85955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:44 compute-0 python3.9[85957]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:06:46 compute-0 sudo[85955]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:46 compute-0 sudo[86108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-comfslitwdgpttwovuimifzlnpjupjqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361206.2868044-79-15507926745641/AnsiballZ_systemd.py'
Dec 10 10:06:46 compute-0 sudo[86108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:47 compute-0 python3.9[86110]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 10 10:06:47 compute-0 sudo[86108]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:47 compute-0 sudo[86263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqoohxewkgbpuvzwofsfniugusapglqy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361207.3927178-87-109466099731900/AnsiballZ_edpm_nftables_snippet.py'
Dec 10 10:06:47 compute-0 sudo[86263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:48 compute-0 python3[86265]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 10 10:06:48 compute-0 sudo[86263]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:48 compute-0 sudo[86415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdwtreuwnpeudawsnfbfnrvsdzhpnekg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361208.2864559-96-139198789723676/AnsiballZ_file.py'
Dec 10 10:06:48 compute-0 sudo[86415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:48 compute-0 python3.9[86417]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:48 compute-0 sudo[86415]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:49 compute-0 sudo[86567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcxqmzpjrnpxwjmlnosaznepbxlenaiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361208.9011605-104-135803434991528/AnsiballZ_stat.py'
Dec 10 10:06:49 compute-0 sudo[86567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:49 compute-0 python3.9[86569]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:49 compute-0 sudo[86567]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:49 compute-0 sudo[86645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcsbnnfaectpdsgdmqvrqikwjsoubaos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361208.9011605-104-135803434991528/AnsiballZ_file.py'
Dec 10 10:06:49 compute-0 sudo[86645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:49 compute-0 python3.9[86647]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:50 compute-0 sudo[86645]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:50 compute-0 sudo[86797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoulacgnvrhbaeapkplpokpxoavqdsff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361210.1532443-116-151716994610551/AnsiballZ_stat.py'
Dec 10 10:06:50 compute-0 sudo[86797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:50 compute-0 python3.9[86799]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:50 compute-0 sudo[86797]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:50 compute-0 sudo[86875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlobhpwlbvicsxmynroqtuvnyexxaxke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361210.1532443-116-151716994610551/AnsiballZ_file.py'
Dec 10 10:06:50 compute-0 sudo[86875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:51 compute-0 python3.9[86877]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.ny03ov3q recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:51 compute-0 sudo[86875]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:51 compute-0 sudo[87027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ognzsrnblitzjffugcyssocqvnekpvbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361211.3035996-128-247768792381139/AnsiballZ_stat.py'
Dec 10 10:06:51 compute-0 sudo[87027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:51 compute-0 python3.9[87029]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:51 compute-0 sudo[87027]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:52 compute-0 sudo[87105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrowjbtafpuvcdqmkgmnrjhbtedbverp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361211.3035996-128-247768792381139/AnsiballZ_file.py'
Dec 10 10:06:52 compute-0 sudo[87105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:52 compute-0 python3.9[87107]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:52 compute-0 sudo[87105]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:52 compute-0 sudo[87257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvxttpzbyncrcooougmascyygtvpwezx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361212.5047066-141-225974925952068/AnsiballZ_command.py'
Dec 10 10:06:52 compute-0 sudo[87257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:53 compute-0 python3.9[87259]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:06:53 compute-0 sudo[87257]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:53 compute-0 sudo[87410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsvedstjxrzyohkfgjhhceshtjdzmxbm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361213.3301415-149-128102366174521/AnsiballZ_edpm_nftables_from_files.py'
Dec 10 10:06:53 compute-0 sudo[87410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:53 compute-0 python3[87412]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 10 10:06:53 compute-0 sudo[87410]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:54 compute-0 sudo[87562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfnqpngtzswsobwxtdhwplqkfgchyqjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361214.1063766-157-140967620686711/AnsiballZ_stat.py'
Dec 10 10:06:54 compute-0 sudo[87562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:54 compute-0 python3.9[87564]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:54 compute-0 sudo[87562]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:55 compute-0 sudo[87687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnefqexwepymmkeufrxrcyeugumrveky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361214.1063766-157-140967620686711/AnsiballZ_copy.py'
Dec 10 10:06:55 compute-0 sudo[87687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:55 compute-0 python3.9[87689]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361214.1063766-157-140967620686711/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:55 compute-0 sudo[87687]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:55 compute-0 sudo[87839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifywmjpssiomnixuoaxkzgaxnjcdyegw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361215.6488118-172-168854162003615/AnsiballZ_stat.py'
Dec 10 10:06:55 compute-0 sudo[87839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:56 compute-0 python3.9[87841]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:56 compute-0 sudo[87839]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:56 compute-0 sudo[87964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlpflbvvpbqjadzaapmkiaiohfngjvzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361215.6488118-172-168854162003615/AnsiballZ_copy.py'
Dec 10 10:06:56 compute-0 sudo[87964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:56 compute-0 python3.9[87966]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361215.6488118-172-168854162003615/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:56 compute-0 sudo[87964]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:57 compute-0 sudo[88116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imqnsdvehqnpmzvkhxjxdcgmijirliro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361217.036073-187-271801537008084/AnsiballZ_stat.py'
Dec 10 10:06:57 compute-0 sudo[88116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:57 compute-0 python3.9[88118]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:57 compute-0 sudo[88116]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:57 compute-0 sudo[88241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwfrkilkvdgcdpjcllpaoncggstvkzgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361217.036073-187-271801537008084/AnsiballZ_copy.py'
Dec 10 10:06:57 compute-0 sudo[88241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:58 compute-0 python3.9[88243]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361217.036073-187-271801537008084/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:58 compute-0 sudo[88241]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:58 compute-0 sudo[88393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoikujwhlonngmwfmniciaxnjcbyisbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361218.3491955-202-3811296186432/AnsiballZ_stat.py'
Dec 10 10:06:58 compute-0 sudo[88393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:58 compute-0 python3.9[88395]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:06:58 compute-0 sudo[88393]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:59 compute-0 sudo[88518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxhfrkfpafvgwdegzobzkcjezwoxsgmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361218.3491955-202-3811296186432/AnsiballZ_copy.py'
Dec 10 10:06:59 compute-0 sudo[88518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:06:59 compute-0 python3.9[88520]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361218.3491955-202-3811296186432/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:06:59 compute-0 sudo[88518]: pam_unix(sudo:session): session closed for user root
Dec 10 10:06:59 compute-0 sudo[88670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjuvpvqvfhooruggrbekrjwpgxavajoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361219.6123927-217-107467401341326/AnsiballZ_stat.py'
Dec 10 10:06:59 compute-0 sudo[88670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:00 compute-0 python3.9[88672]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:00 compute-0 sudo[88670]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:00 compute-0 sudo[88795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bigqoofklmflebhjxvelcoroysfyzmon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361219.6123927-217-107467401341326/AnsiballZ_copy.py'
Dec 10 10:07:00 compute-0 sudo[88795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:00 compute-0 python3.9[88797]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361219.6123927-217-107467401341326/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:00 compute-0 sudo[88795]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:01 compute-0 sudo[88947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iidmdksdizrztjewrlufagqbqmocdnqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361220.9422553-232-68410921851140/AnsiballZ_file.py'
Dec 10 10:07:01 compute-0 sudo[88947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:01 compute-0 python3.9[88949]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:01 compute-0 sudo[88947]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:01 compute-0 sudo[89099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxulxeoqiinkraqpupfyikefxnqbzppp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361221.6145027-240-208523225851630/AnsiballZ_command.py'
Dec 10 10:07:01 compute-0 sudo[89099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:02 compute-0 python3.9[89101]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:07:02 compute-0 sudo[89099]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:02 compute-0 sudo[89254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwvbkgbihvgllkeqwgrstmkkdsglisjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361222.259276-248-15651654077396/AnsiballZ_blockinfile.py'
Dec 10 10:07:02 compute-0 sudo[89254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:02 compute-0 python3.9[89256]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:02 compute-0 sudo[89254]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:03 compute-0 sudo[89406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soolgjqyybnotbiewvoxugclyvdcrbrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361223.0905282-257-149856266116185/AnsiballZ_command.py'
Dec 10 10:07:03 compute-0 sudo[89406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:03 compute-0 python3.9[89408]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:07:03 compute-0 sudo[89406]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:04 compute-0 sudo[89560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezukfldukbiiqraoymsmdjjtcdnubrjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361223.7415822-265-8412293322057/AnsiballZ_stat.py'
Dec 10 10:07:04 compute-0 sudo[89560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:04 compute-0 python3.9[89562]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:07:04 compute-0 sudo[89560]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:05 compute-0 sudo[89714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buebvpezpbfiatgrytclpjmjllpwymla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361224.8261359-273-199175121798420/AnsiballZ_command.py'
Dec 10 10:07:05 compute-0 sudo[89714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:05 compute-0 python3.9[89716]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:07:05 compute-0 sudo[89714]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:05 compute-0 sudo[89869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpjznbvemxydkoefsedodayqiexfhnow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361225.477471-281-84947833661331/AnsiballZ_file.py'
Dec 10 10:07:05 compute-0 sudo[89869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:05 compute-0 python3.9[89871]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:06 compute-0 sudo[89869]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:07 compute-0 python3.9[90021]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:07:08 compute-0 sudo[90172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghxclxdpfumozcimufyvbspjnylrufoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361227.8148148-321-272878923042832/AnsiballZ_command.py'
Dec 10 10:07:08 compute-0 sudo[90172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:08 compute-0 python3.9[90174]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:cb:58:d7:dd" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:07:08 compute-0 ovs-vsctl[90175]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:cb:58:d7:dd external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 10 10:07:08 compute-0 sudo[90172]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:08 compute-0 sudo[90325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcoscmtaaxqobdgjsqtyvqhyxmyqqbok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361228.463401-330-237763891275435/AnsiballZ_command.py'
Dec 10 10:07:08 compute-0 sudo[90325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:08 compute-0 python3.9[90327]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:07:08 compute-0 sudo[90325]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:09 compute-0 sudo[90480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmirrwaqzgbgecysbqqpeanyidtvjtrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361229.075388-338-62373242548270/AnsiballZ_command.py'
Dec 10 10:07:09 compute-0 sudo[90480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:09 compute-0 python3.9[90482]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:07:09 compute-0 ovs-vsctl[90483]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 10 10:07:09 compute-0 sudo[90480]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:10 compute-0 python3.9[90633]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:07:10 compute-0 sudo[90785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gprqwqadegeogthhbeozmngfuxczdsaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361230.3665752-355-153806543540455/AnsiballZ_file.py'
Dec 10 10:07:10 compute-0 sudo[90785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:10 compute-0 python3.9[90787]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:10 compute-0 sudo[90785]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:11 compute-0 sudo[90937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndyppcqqtubwmflskwntcthzvfmzivum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361231.0215821-363-20076282840514/AnsiballZ_stat.py'
Dec 10 10:07:11 compute-0 sudo[90937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:11 compute-0 python3.9[90939]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:11 compute-0 sudo[90937]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:11 compute-0 sudo[91015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuyirknbpedvspnahawkoixulziywmty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361231.0215821-363-20076282840514/AnsiballZ_file.py'
Dec 10 10:07:11 compute-0 sudo[91015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:11 compute-0 python3.9[91017]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:12 compute-0 sudo[91015]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:12 compute-0 sudo[91167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkaleseumrymaulwaaoutajloroiwoev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361232.11482-363-45552472028902/AnsiballZ_stat.py'
Dec 10 10:07:12 compute-0 sudo[91167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:12 compute-0 python3.9[91169]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:12 compute-0 sudo[91167]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:12 compute-0 sudo[91245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crqaazobivlrvhmsjkxircygesrsgfbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361232.11482-363-45552472028902/AnsiballZ_file.py'
Dec 10 10:07:12 compute-0 sudo[91245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:13 compute-0 python3.9[91247]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:13 compute-0 sudo[91245]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:13 compute-0 sudo[91397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txrvvimolvdcrdsajhilfchvumxbotkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361233.2409687-386-122242136542512/AnsiballZ_file.py'
Dec 10 10:07:13 compute-0 sudo[91397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:13 compute-0 python3.9[91399]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:13 compute-0 sudo[91397]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:14 compute-0 sudo[91549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-behfsawuvbjnqrrtsgtmrcmfoifjtxos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361233.8730843-394-34055992446038/AnsiballZ_stat.py'
Dec 10 10:07:14 compute-0 sudo[91549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:14 compute-0 python3.9[91551]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:14 compute-0 sudo[91549]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:14 compute-0 sudo[91627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brtirbvypxyscujhcithfobfbfvgrfkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361233.8730843-394-34055992446038/AnsiballZ_file.py'
Dec 10 10:07:14 compute-0 sudo[91627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:14 compute-0 python3.9[91629]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:14 compute-0 sudo[91627]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:15 compute-0 sudo[91779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usesypdcjpkpnatfujsomynmusdgcafe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361235.060634-406-5283211250699/AnsiballZ_stat.py'
Dec 10 10:07:15 compute-0 sudo[91779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:15 compute-0 python3.9[91781]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:15 compute-0 sudo[91779]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:15 compute-0 sudo[91857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcrqbyequzjmrkrlvbtbygvtslwhfxxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361235.060634-406-5283211250699/AnsiballZ_file.py'
Dec 10 10:07:15 compute-0 sudo[91857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:16 compute-0 python3.9[91859]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:16 compute-0 sudo[91857]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:16 compute-0 sudo[92009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytzuvltklwwybhwtnlyzkmdjjqykgaha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361236.1744087-418-214837861787227/AnsiballZ_systemd.py'
Dec 10 10:07:16 compute-0 sudo[92009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:16 compute-0 python3.9[92011]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:07:16 compute-0 systemd[1]: Reloading.
Dec 10 10:07:16 compute-0 systemd-rc-local-generator[92039]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:07:16 compute-0 systemd-sysv-generator[92042]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:07:17 compute-0 sudo[92009]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:17 compute-0 sudo[92198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vimhigjdslivrslvysulanmlzxybdmfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361237.226105-426-199283231795668/AnsiballZ_stat.py'
Dec 10 10:07:17 compute-0 sudo[92198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:17 compute-0 python3.9[92200]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:17 compute-0 sudo[92198]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:17 compute-0 sudo[92276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsruoxabzplgiibquffileqdiesuuzdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361237.226105-426-199283231795668/AnsiballZ_file.py'
Dec 10 10:07:17 compute-0 sudo[92276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:18 compute-0 python3.9[92278]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:18 compute-0 sudo[92276]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:18 compute-0 sudo[92428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkdfxdbzxujdgnhddwedjlzbohfuryot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361238.302253-438-121750240488975/AnsiballZ_stat.py'
Dec 10 10:07:18 compute-0 sudo[92428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:18 compute-0 python3.9[92430]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:18 compute-0 sudo[92428]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:19 compute-0 sudo[92506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcmftxlazqrbxehiyfoynccfdagwxufq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361238.302253-438-121750240488975/AnsiballZ_file.py'
Dec 10 10:07:19 compute-0 sudo[92506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:19 compute-0 python3.9[92508]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:19 compute-0 sudo[92506]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:19 compute-0 sudo[92658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkfizuohiqrktnerdryblbrohigmtvuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361239.494876-450-109064789375994/AnsiballZ_systemd.py'
Dec 10 10:07:19 compute-0 sudo[92658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:20 compute-0 python3.9[92660]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:07:20 compute-0 systemd[1]: Reloading.
Dec 10 10:07:20 compute-0 systemd-sysv-generator[92689]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:07:20 compute-0 systemd-rc-local-generator[92683]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:07:20 compute-0 systemd[1]: Starting Create netns directory...
Dec 10 10:07:20 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 10 10:07:20 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 10 10:07:20 compute-0 systemd[1]: Finished Create netns directory.
Dec 10 10:07:20 compute-0 sudo[92658]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:21 compute-0 sudo[92852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iayxvjvlpbxbrrietpsduljguwixlkvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361240.7355793-460-242769955079971/AnsiballZ_file.py'
Dec 10 10:07:21 compute-0 sudo[92852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:21 compute-0 python3.9[92854]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:21 compute-0 sudo[92852]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:21 compute-0 sudo[93004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhltnoalilljlaychuhldbalcttpkklg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361241.4040709-468-40384143381484/AnsiballZ_stat.py'
Dec 10 10:07:21 compute-0 sudo[93004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:21 compute-0 python3.9[93006]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:21 compute-0 sudo[93004]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:22 compute-0 sudo[93127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pejkoctgkaeeozdsgmotpwyzjbbiigop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361241.4040709-468-40384143381484/AnsiballZ_copy.py'
Dec 10 10:07:22 compute-0 sudo[93127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:22 compute-0 python3.9[93129]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361241.4040709-468-40384143381484/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:22 compute-0 sudo[93127]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:23 compute-0 sudo[93279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfdqjefhcmcreoiaodfgpmeavhtwgxya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361242.8279345-485-32282145216895/AnsiballZ_file.py'
Dec 10 10:07:23 compute-0 sudo[93279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:23 compute-0 python3.9[93281]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:23 compute-0 sudo[93279]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:23 compute-0 sudo[93431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqeiucqnyakuvywzktpgrjifgqzwsydd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361243.4972975-493-25625477045689/AnsiballZ_stat.py'
Dec 10 10:07:23 compute-0 sudo[93431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:23 compute-0 python3.9[93433]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:23 compute-0 sudo[93431]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:24 compute-0 sudo[93554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvdbdjawomroftrhtypgfeuacelncveo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361243.4972975-493-25625477045689/AnsiballZ_copy.py'
Dec 10 10:07:24 compute-0 sudo[93554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:24 compute-0 python3.9[93556]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361243.4972975-493-25625477045689/.source.json _original_basename=._h2ak99b follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:24 compute-0 sudo[93554]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:24 compute-0 sudo[93706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orahnyiwpnkcoftjmvmdumepzttuqjot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361244.632871-508-273414446208243/AnsiballZ_file.py'
Dec 10 10:07:24 compute-0 sudo[93706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:25 compute-0 python3.9[93708]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:25 compute-0 sudo[93706]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:25 compute-0 sudo[93858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgiongjxznpnibbecysxlymjhkjhstwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361245.3820672-516-250806661694431/AnsiballZ_stat.py'
Dec 10 10:07:25 compute-0 sudo[93858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:25 compute-0 sudo[93858]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:26 compute-0 sudo[93981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orfrwbirbrtjwubtsddwmssvobiygygc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361245.3820672-516-250806661694431/AnsiballZ_copy.py'
Dec 10 10:07:26 compute-0 sudo[93981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:26 compute-0 sudo[93981]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:27 compute-0 sudo[94133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bynkjjzuijvpmrxminylsroqyiuujnvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361246.6367602-533-125328609215022/AnsiballZ_container_config_data.py'
Dec 10 10:07:27 compute-0 sudo[94133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:27 compute-0 python3.9[94135]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 10 10:07:27 compute-0 sudo[94133]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:27 compute-0 sudo[94285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azxeqkxwfwkeanfdcccyzcebsezcbkvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361247.4744844-542-180800337369504/AnsiballZ_container_config_hash.py'
Dec 10 10:07:27 compute-0 sudo[94285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:28 compute-0 python3.9[94287]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 10 10:07:28 compute-0 sudo[94285]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:28 compute-0 sudo[94437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seixbjicnlbldecpbajilgptxbspbddh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361248.3637881-551-2068222314050/AnsiballZ_podman_container_info.py'
Dec 10 10:07:28 compute-0 sudo[94437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:29 compute-0 python3.9[94439]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 10 10:07:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:07:29 compute-0 sudo[94437]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:30 compute-0 sudo[94600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgktwgdscrkmxjvnqhqstwzaczotvpmj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361249.549464-564-165463106568278/AnsiballZ_edpm_container_manage.py'
Dec 10 10:07:30 compute-0 sudo[94600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:30 compute-0 python3[94602]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 10 10:07:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:07:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:07:30 compute-0 podman[94638]: 2025-12-10 10:07:30.503117569 +0000 UTC m=+0.041058690 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 10 10:07:30 compute-0 podman[94638]: 2025-12-10 10:07:30.696509338 +0000 UTC m=+0.234450489 container create e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 10 10:07:30 compute-0 python3[94602]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 10 10:07:30 compute-0 sudo[94600]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:31 compute-0 sudo[94826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygjrmjrumsmpfufpduslccqkhdmcdbmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361251.0147986-572-240575876571959/AnsiballZ_stat.py'
Dec 10 10:07:31 compute-0 sudo[94826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 10 10:07:31 compute-0 python3.9[94828]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:07:31 compute-0 sudo[94826]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:32 compute-0 sudo[94980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-resgcuyvmcjqdctnlkxekqaotnbwlvir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361251.726874-581-63046797370068/AnsiballZ_file.py'
Dec 10 10:07:32 compute-0 sudo[94980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:32 compute-0 python3.9[94982]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:32 compute-0 sudo[94980]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:32 compute-0 sudo[95056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apqzckfmuptkmnmzxijwdtlosrlmsaiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361251.726874-581-63046797370068/AnsiballZ_stat.py'
Dec 10 10:07:32 compute-0 sudo[95056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:32 compute-0 python3.9[95058]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:07:32 compute-0 sudo[95056]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:33 compute-0 sudo[95207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cggvgjshoobwxxekpwizxabtoawsehqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361252.6935692-581-136948965720944/AnsiballZ_copy.py'
Dec 10 10:07:33 compute-0 sudo[95207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:33 compute-0 python3.9[95209]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765361252.6935692-581-136948965720944/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:07:33 compute-0 sudo[95207]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:33 compute-0 sudo[95283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fubsayauhgksvykmouqmhqrmezuyhtta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361252.6935692-581-136948965720944/AnsiballZ_systemd.py'
Dec 10 10:07:33 compute-0 sudo[95283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:33 compute-0 python3.9[95285]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:07:33 compute-0 systemd[1]: Reloading.
Dec 10 10:07:33 compute-0 systemd-sysv-generator[95316]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:07:33 compute-0 systemd-rc-local-generator[95313]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:07:34 compute-0 sudo[95283]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:34 compute-0 sudo[95394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onkzvtnmejelmdlfgojdjpckpguznxro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361252.6935692-581-136948965720944/AnsiballZ_systemd.py'
Dec 10 10:07:34 compute-0 sudo[95394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:34 compute-0 python3.9[95396]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:07:34 compute-0 systemd[1]: Reloading.
Dec 10 10:07:34 compute-0 systemd-rc-local-generator[95425]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:07:34 compute-0 systemd-sysv-generator[95428]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:07:34 compute-0 systemd[1]: Starting ovn_controller container...
Dec 10 10:07:34 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 10 10:07:34 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:07:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbb03e9e02af5715adea664efd6d0fb821f8f9c0daf891a23d02126635d374cd/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 10 10:07:34 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6.
Dec 10 10:07:34 compute-0 podman[95437]: 2025-12-10 10:07:34.919605942 +0000 UTC m=+0.136902333 container init e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:07:34 compute-0 ovn_controller[95452]: + sudo -E kolla_set_configs
Dec 10 10:07:34 compute-0 podman[95437]: 2025-12-10 10:07:34.949185818 +0000 UTC m=+0.166482199 container start e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 10 10:07:34 compute-0 edpm-start-podman-container[95437]: ovn_controller
Dec 10 10:07:34 compute-0 systemd[1]: Created slice User Slice of UID 0.
Dec 10 10:07:34 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 10 10:07:34 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 10 10:07:35 compute-0 systemd[1]: Starting User Manager for UID 0...
Dec 10 10:07:35 compute-0 systemd[95487]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Dec 10 10:07:35 compute-0 edpm-start-podman-container[95436]: Creating additional drop-in dependency for "ovn_controller" (e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6)
Dec 10 10:07:35 compute-0 podman[95458]: 2025-12-10 10:07:35.029889319 +0000 UTC m=+0.069333011 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 10 10:07:35 compute-0 systemd[1]: e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6-39ac9a8ae9513f3.service: Main process exited, code=exited, status=1/FAILURE
Dec 10 10:07:35 compute-0 systemd[1]: e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6-39ac9a8ae9513f3.service: Failed with result 'exit-code'.
Dec 10 10:07:35 compute-0 systemd[1]: Reloading.
Dec 10 10:07:35 compute-0 systemd-sysv-generator[95540]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:07:35 compute-0 systemd-rc-local-generator[95537]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:07:35 compute-0 systemd[95487]: Queued start job for default target Main User Target.
Dec 10 10:07:35 compute-0 systemd[95487]: Created slice User Application Slice.
Dec 10 10:07:35 compute-0 systemd[95487]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 10 10:07:35 compute-0 systemd[95487]: Started Daily Cleanup of User's Temporary Directories.
Dec 10 10:07:35 compute-0 systemd[95487]: Reached target Paths.
Dec 10 10:07:35 compute-0 systemd[95487]: Reached target Timers.
Dec 10 10:07:35 compute-0 systemd[95487]: Starting D-Bus User Message Bus Socket...
Dec 10 10:07:35 compute-0 systemd[95487]: Starting Create User's Volatile Files and Directories...
Dec 10 10:07:35 compute-0 systemd[95487]: Finished Create User's Volatile Files and Directories.
Dec 10 10:07:35 compute-0 systemd[95487]: Listening on D-Bus User Message Bus Socket.
Dec 10 10:07:35 compute-0 systemd[95487]: Reached target Sockets.
Dec 10 10:07:35 compute-0 systemd[95487]: Reached target Basic System.
Dec 10 10:07:35 compute-0 systemd[95487]: Reached target Main User Target.
Dec 10 10:07:35 compute-0 systemd[95487]: Startup finished in 131ms.
Dec 10 10:07:35 compute-0 systemd[1]: Started User Manager for UID 0.
Dec 10 10:07:35 compute-0 systemd[1]: Started ovn_controller container.
Dec 10 10:07:35 compute-0 systemd[1]: Started Session c1 of User root.
Dec 10 10:07:35 compute-0 sudo[95394]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:35 compute-0 ovn_controller[95452]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 10 10:07:35 compute-0 ovn_controller[95452]: INFO:__main__:Validating config file
Dec 10 10:07:35 compute-0 ovn_controller[95452]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 10 10:07:35 compute-0 ovn_controller[95452]: INFO:__main__:Writing out command to execute
Dec 10 10:07:35 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 10 10:07:35 compute-0 ovn_controller[95452]: ++ cat /run_command
Dec 10 10:07:35 compute-0 ovn_controller[95452]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 10 10:07:35 compute-0 ovn_controller[95452]: + ARGS=
Dec 10 10:07:35 compute-0 ovn_controller[95452]: + sudo kolla_copy_cacerts
Dec 10 10:07:35 compute-0 systemd[1]: Started Session c2 of User root.
Dec 10 10:07:35 compute-0 ovn_controller[95452]: + [[ ! -n '' ]]
Dec 10 10:07:35 compute-0 ovn_controller[95452]: + . kolla_extend_start
Dec 10 10:07:35 compute-0 ovn_controller[95452]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 10 10:07:35 compute-0 ovn_controller[95452]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 10 10:07:35 compute-0 ovn_controller[95452]: + umask 0022
Dec 10 10:07:35 compute-0 ovn_controller[95452]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec 10 10:07:35 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 10 10:07:35 compute-0 NetworkManager[55541]: <info>  [1765361255.4338] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 10 10:07:35 compute-0 NetworkManager[55541]: <info>  [1765361255.4348] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:07:35 compute-0 NetworkManager[55541]: <warn>  [1765361255.4351] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 10 10:07:35 compute-0 NetworkManager[55541]: <info>  [1765361255.4357] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Dec 10 10:07:35 compute-0 NetworkManager[55541]: <info>  [1765361255.4362] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Dec 10 10:07:35 compute-0 NetworkManager[55541]: <info>  [1765361255.4364] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 10 10:07:35 compute-0 kernel: br-int: entered promiscuous mode
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00022|main|INFO|OVS feature set changed, force recompute.
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 10 10:07:35 compute-0 NetworkManager[55541]: <info>  [1765361255.4685] manager: (ovn-94fd96-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 10 10:07:35 compute-0 ovn_controller[95452]: 2025-12-10T10:07:35Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 10 10:07:35 compute-0 systemd-udevd[95613]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:07:35 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Dec 10 10:07:35 compute-0 systemd-udevd[95616]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:07:35 compute-0 NetworkManager[55541]: <info>  [1765361255.4900] device (genev_sys_6081): carrier: link connected
Dec 10 10:07:35 compute-0 NetworkManager[55541]: <info>  [1765361255.4904] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Dec 10 10:07:35 compute-0 sudo[95715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auwdjjsutppwwjrfcgrwhhqcpmyttkxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361255.4560091-609-203871229443488/AnsiballZ_command.py'
Dec 10 10:07:35 compute-0 sudo[95715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:35 compute-0 python3.9[95717]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:07:35 compute-0 ovs-vsctl[95718]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 10 10:07:35 compute-0 sudo[95715]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:36 compute-0 sudo[95868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktzzmhooyjrekfsrspsidkdbcadjaivf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361256.1679776-617-97971558105587/AnsiballZ_command.py'
Dec 10 10:07:36 compute-0 sudo[95868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:36 compute-0 python3.9[95870]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:07:36 compute-0 ovs-vsctl[95872]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 10 10:07:36 compute-0 sudo[95868]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:37 compute-0 sudo[96023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwzajffuebtbrwkfevmhaxykctbuepie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361257.0183723-631-158891193825379/AnsiballZ_command.py'
Dec 10 10:07:37 compute-0 sudo[96023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:37 compute-0 python3.9[96025]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:07:37 compute-0 ovs-vsctl[96026]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 10 10:07:37 compute-0 sudo[96023]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:37 compute-0 sshd-session[84957]: Connection closed by 192.168.122.30 port 56284
Dec 10 10:07:37 compute-0 sshd-session[84954]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:07:38 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Dec 10 10:07:38 compute-0 systemd[1]: session-20.scope: Consumed 45.808s CPU time.
Dec 10 10:07:38 compute-0 systemd-logind[787]: Session 20 logged out. Waiting for processes to exit.
Dec 10 10:07:38 compute-0 systemd-logind[787]: Removed session 20.
Dec 10 10:07:43 compute-0 sshd-session[96051]: Accepted publickey for zuul from 192.168.122.30 port 57354 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:07:43 compute-0 systemd-logind[787]: New session 22 of user zuul.
Dec 10 10:07:43 compute-0 systemd[1]: Started Session 22 of User zuul.
Dec 10 10:07:43 compute-0 sshd-session[96051]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:07:44 compute-0 python3.9[96204]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:07:45 compute-0 systemd[1]: Stopping User Manager for UID 0...
Dec 10 10:07:45 compute-0 systemd[95487]: Activating special unit Exit the Session...
Dec 10 10:07:45 compute-0 systemd[95487]: Stopped target Main User Target.
Dec 10 10:07:45 compute-0 systemd[95487]: Stopped target Basic System.
Dec 10 10:07:45 compute-0 systemd[95487]: Stopped target Paths.
Dec 10 10:07:45 compute-0 systemd[95487]: Stopped target Sockets.
Dec 10 10:07:45 compute-0 systemd[95487]: Stopped target Timers.
Dec 10 10:07:45 compute-0 systemd[95487]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 10 10:07:45 compute-0 systemd[95487]: Closed D-Bus User Message Bus Socket.
Dec 10 10:07:45 compute-0 systemd[95487]: Stopped Create User's Volatile Files and Directories.
Dec 10 10:07:45 compute-0 systemd[95487]: Removed slice User Application Slice.
Dec 10 10:07:45 compute-0 systemd[95487]: Reached target Shutdown.
Dec 10 10:07:45 compute-0 systemd[95487]: Finished Exit the Session.
Dec 10 10:07:45 compute-0 systemd[95487]: Reached target Exit the Session.
Dec 10 10:07:45 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Dec 10 10:07:45 compute-0 systemd[1]: Stopped User Manager for UID 0.
Dec 10 10:07:45 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 10 10:07:45 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 10 10:07:45 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 10 10:07:45 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 10 10:07:45 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Dec 10 10:07:45 compute-0 sudo[96361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iexxbnbfhtksmyeuhursxdtrpgenmnhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361264.9998062-34-226472697481718/AnsiballZ_file.py'
Dec 10 10:07:45 compute-0 sudo[96361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:45 compute-0 python3.9[96363]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:45 compute-0 sudo[96361]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:46 compute-0 sudo[96513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbyymgbgpvyjgkyqqpwcnhbwogzqvolf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361265.811317-34-226519146349021/AnsiballZ_file.py'
Dec 10 10:07:46 compute-0 sudo[96513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:46 compute-0 python3.9[96515]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:46 compute-0 sudo[96513]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:46 compute-0 sudo[96665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdblsmmuvxbzxnauctacxkmeyfmpfveq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361266.3972583-34-195324044093454/AnsiballZ_file.py'
Dec 10 10:07:46 compute-0 sudo[96665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:46 compute-0 python3.9[96667]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:46 compute-0 sudo[96665]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:47 compute-0 sudo[96817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyxjomdsehkyyhdjkrxneqrtnjqrbeac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361267.0637877-34-261533969997473/AnsiballZ_file.py'
Dec 10 10:07:47 compute-0 sudo[96817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:47 compute-0 python3.9[96819]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:47 compute-0 sudo[96817]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:48 compute-0 sudo[96969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frpwvtxaqaibupfynbyuxsjkwyograuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361267.760767-34-44159551188646/AnsiballZ_file.py'
Dec 10 10:07:48 compute-0 sudo[96969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:48 compute-0 python3.9[96971]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:48 compute-0 sudo[96969]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:49 compute-0 python3.9[97121]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:07:49 compute-0 sudo[97271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqsmybhcqmdagszxslzpczozbscyztzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361269.3097668-78-23155007991527/AnsiballZ_seboolean.py'
Dec 10 10:07:49 compute-0 sudo[97271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:49 compute-0 python3.9[97273]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 10 10:07:50 compute-0 sudo[97271]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:51 compute-0 python3.9[97423]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:52 compute-0 python3.9[97544]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361270.7673204-86-251012623627040/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:52 compute-0 python3.9[97694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:53 compute-0 python3.9[97815]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361272.2162027-101-186601088798328/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:53 compute-0 sudo[97966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfcjhbedletewbtddmtsmwitpermoevh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361273.4703226-118-204522831471914/AnsiballZ_setup.py'
Dec 10 10:07:53 compute-0 sudo[97966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:54 compute-0 python3.9[97968]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:07:54 compute-0 sudo[97966]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:54 compute-0 sudo[98050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvuhpobepmymjfdypbzupnbiejjkcabk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361273.4703226-118-204522831471914/AnsiballZ_dnf.py'
Dec 10 10:07:54 compute-0 sudo[98050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:54 compute-0 python3.9[98052]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:07:56 compute-0 sudo[98050]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:56 compute-0 sudo[98203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcjmmtehookrjlpfibxxpjwscozgxavn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361276.3015525-130-170661659339783/AnsiballZ_systemd.py'
Dec 10 10:07:56 compute-0 sudo[98203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:07:57 compute-0 python3.9[98205]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 10 10:07:57 compute-0 sudo[98203]: pam_unix(sudo:session): session closed for user root
Dec 10 10:07:57 compute-0 python3.9[98358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:58 compute-0 python3.9[98479]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361277.430396-138-153597915547470/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:07:59 compute-0 python3.9[98629]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:07:59 compute-0 python3.9[98750]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361278.6226645-138-15705461113527/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:08:00 compute-0 python3.9[98900]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:08:01 compute-0 python3.9[99021]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361280.3572094-182-228903934459959/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:08:02 compute-0 python3.9[99171]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:08:02 compute-0 python3.9[99292]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361281.5640204-182-177826833580894/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:08:03 compute-0 python3.9[99442]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:08:03 compute-0 sudo[99594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goxugtxicwmahhafxswyklvmgacmftba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361283.4769416-220-111709334315237/AnsiballZ_file.py'
Dec 10 10:08:03 compute-0 sudo[99594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:03 compute-0 python3.9[99596]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:08:04 compute-0 sudo[99594]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:04 compute-0 sudo[99746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzitfluxcmukyvmsxleyddjkwptkrcto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361284.192089-228-123279777243033/AnsiballZ_stat.py'
Dec 10 10:08:04 compute-0 sudo[99746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:04 compute-0 python3.9[99748]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:08:04 compute-0 sudo[99746]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:04 compute-0 sudo[99824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guoytuuskcudmrykbltavkjleqvhxolb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361284.192089-228-123279777243033/AnsiballZ_file.py'
Dec 10 10:08:04 compute-0 sudo[99824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:05 compute-0 python3.9[99826]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:08:05 compute-0 sudo[99824]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:05 compute-0 sudo[99989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqvwlqblwlgbwkmmmtsbrqihqpxevvec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361285.4180193-228-169604148804264/AnsiballZ_stat.py'
Dec 10 10:08:05 compute-0 sudo[99989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:05 compute-0 ovn_controller[95452]: 2025-12-10T10:08:05Z|00025|memory|INFO|16128 kB peak resident set size after 30.4 seconds
Dec 10 10:08:05 compute-0 ovn_controller[95452]: 2025-12-10T10:08:05Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Dec 10 10:08:05 compute-0 podman[99950]: 2025-12-10 10:08:05.805173796 +0000 UTC m=+0.113677290 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller)
Dec 10 10:08:05 compute-0 python3.9[99998]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:08:05 compute-0 sudo[99989]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:06 compute-0 sudo[100080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeulwprcddpjraiazmxufbmijyomigdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361285.4180193-228-169604148804264/AnsiballZ_file.py'
Dec 10 10:08:06 compute-0 sudo[100080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:06 compute-0 python3.9[100082]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:08:06 compute-0 sudo[100080]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:06 compute-0 sudo[100232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqgkjqniudgxfapflsqyxmboshdgveey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361286.5562763-251-108476857869269/AnsiballZ_file.py'
Dec 10 10:08:06 compute-0 sudo[100232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:07 compute-0 python3.9[100234]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:07 compute-0 sudo[100232]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:07 compute-0 sudo[100384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctwbmaxsflvutzfxtvcvqanbwyzvzcck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361287.3718421-259-158241091607468/AnsiballZ_stat.py'
Dec 10 10:08:07 compute-0 sudo[100384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:07 compute-0 python3.9[100386]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:08:07 compute-0 sudo[100384]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:08 compute-0 sudo[100462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htmglkbzlguxjijmhmaqilumwqzgjnzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361287.3718421-259-158241091607468/AnsiballZ_file.py'
Dec 10 10:08:08 compute-0 sudo[100462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:08 compute-0 python3.9[100464]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:08 compute-0 sudo[100462]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:08 compute-0 sudo[100614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgnuspqeavsamzcwxnribyzudpavjzsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361288.568107-271-22066181169128/AnsiballZ_stat.py'
Dec 10 10:08:08 compute-0 sudo[100614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:09 compute-0 python3.9[100616]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:08:09 compute-0 sudo[100614]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:09 compute-0 sudo[100692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azufdaeyatrxnkgqojdqmrbawzzrnitf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361288.568107-271-22066181169128/AnsiballZ_file.py'
Dec 10 10:08:09 compute-0 sudo[100692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:09 compute-0 python3.9[100694]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:09 compute-0 sudo[100692]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:10 compute-0 sudo[100844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzwwufrobglmzpulmtrvvwrsegfjcels ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361289.7289057-283-51178390215130/AnsiballZ_systemd.py'
Dec 10 10:08:10 compute-0 sudo[100844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:10 compute-0 python3.9[100846]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:08:10 compute-0 systemd[1]: Reloading.
Dec 10 10:08:10 compute-0 systemd-sysv-generator[100877]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:08:10 compute-0 systemd-rc-local-generator[100874]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:08:10 compute-0 sudo[100844]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:11 compute-0 sudo[101033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixwrirpenofsfyihzvtxmmmvmokpzpwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361290.7526264-291-41444709625406/AnsiballZ_stat.py'
Dec 10 10:08:11 compute-0 sudo[101033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:11 compute-0 python3.9[101035]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:08:11 compute-0 sudo[101033]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:11 compute-0 sudo[101111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnzbxfbavlqwsiewclayuwkzgcwxvkcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361290.7526264-291-41444709625406/AnsiballZ_file.py'
Dec 10 10:08:11 compute-0 sudo[101111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:11 compute-0 python3.9[101113]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:11 compute-0 sudo[101111]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:12 compute-0 sudo[101263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipheryjpdnyzzqpnaqjvhcjxysghsczj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361291.8946772-303-37176909692910/AnsiballZ_stat.py'
Dec 10 10:08:12 compute-0 sudo[101263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:12 compute-0 python3.9[101265]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:08:12 compute-0 sudo[101263]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:12 compute-0 sudo[101341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuwopfihtpfpbsqlmcdlvtxbxgjzoxwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361291.8946772-303-37176909692910/AnsiballZ_file.py'
Dec 10 10:08:12 compute-0 sudo[101341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:12 compute-0 python3.9[101343]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:12 compute-0 sudo[101341]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:13 compute-0 sudo[101493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrzbirhakuipsrdgbeaumrjsdwlndufj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361292.9976163-315-275783895475265/AnsiballZ_systemd.py'
Dec 10 10:08:13 compute-0 sudo[101493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:13 compute-0 python3.9[101495]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:08:13 compute-0 systemd[1]: Reloading.
Dec 10 10:08:13 compute-0 systemd-sysv-generator[101524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:08:13 compute-0 systemd-rc-local-generator[101519]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:08:13 compute-0 systemd[1]: Starting Create netns directory...
Dec 10 10:08:13 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 10 10:08:13 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 10 10:08:13 compute-0 systemd[1]: Finished Create netns directory.
Dec 10 10:08:13 compute-0 sudo[101493]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:14 compute-0 sudo[101686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjzbjzosieyicoipvkyvswlionbjqztb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361294.1860106-325-91634821188471/AnsiballZ_file.py'
Dec 10 10:08:14 compute-0 sudo[101686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:14 compute-0 python3.9[101688]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:08:14 compute-0 sudo[101686]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:15 compute-0 sudo[101838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blwdzytabyxkrxiccdggnpvisvjxosee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361294.8524196-333-135624749023744/AnsiballZ_stat.py'
Dec 10 10:08:15 compute-0 sudo[101838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:15 compute-0 python3.9[101840]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:08:15 compute-0 sudo[101838]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:15 compute-0 sudo[101961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlrascbvgokmmvjctrukrfnxkljmmtpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361294.8524196-333-135624749023744/AnsiballZ_copy.py'
Dec 10 10:08:15 compute-0 sudo[101961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:16 compute-0 python3.9[101963]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361294.8524196-333-135624749023744/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:08:16 compute-0 sudo[101961]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:16 compute-0 sudo[102113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjborhizzabptvueasmttahrxgwoxoln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361296.48754-350-9220572910378/AnsiballZ_file.py'
Dec 10 10:08:16 compute-0 sudo[102113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:16 compute-0 python3.9[102115]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:08:17 compute-0 sudo[102113]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:17 compute-0 sudo[102265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohpiljwzglykutxdwypyokzkpoyxqdga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361297.1926017-358-245099497885977/AnsiballZ_stat.py'
Dec 10 10:08:17 compute-0 sudo[102265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:17 compute-0 python3.9[102267]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:08:17 compute-0 sudo[102265]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:18 compute-0 sudo[102388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-copatdzhzhepnpwpilsabwpxvmhnwowt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361297.1926017-358-245099497885977/AnsiballZ_copy.py'
Dec 10 10:08:18 compute-0 sudo[102388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:18 compute-0 python3.9[102390]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361297.1926017-358-245099497885977/.source.json _original_basename=.hd2x2d2k follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:18 compute-0 sudo[102388]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:18 compute-0 sudo[102540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wppexynecgfflbrxeurdgmiwqnwdpejw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361298.613074-373-178153533157603/AnsiballZ_file.py'
Dec 10 10:08:18 compute-0 sudo[102540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:19 compute-0 python3.9[102542]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:19 compute-0 sudo[102540]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:19 compute-0 sudo[102692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwicpspzszviiafhkugedylplfrepzib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361299.2936647-381-77588282827251/AnsiballZ_stat.py'
Dec 10 10:08:19 compute-0 sudo[102692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:19 compute-0 sudo[102692]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:20 compute-0 sudo[102815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzavnqwyelzktrnrsaorhxdpylohptvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361299.2936647-381-77588282827251/AnsiballZ_copy.py'
Dec 10 10:08:20 compute-0 sudo[102815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:20 compute-0 sudo[102815]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:21 compute-0 sudo[102967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozlzmjtspcodarzvevojudfvbbdpphhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361300.6421373-398-226371490307323/AnsiballZ_container_config_data.py'
Dec 10 10:08:21 compute-0 sudo[102967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:21 compute-0 python3.9[102969]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 10 10:08:21 compute-0 sudo[102967]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:22 compute-0 sudo[103119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujqucjxekgcjfayrcpatagzjogcacrax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361301.5355554-407-6720810439097/AnsiballZ_container_config_hash.py'
Dec 10 10:08:22 compute-0 sudo[103119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:22 compute-0 python3.9[103121]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 10 10:08:22 compute-0 sudo[103119]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:23 compute-0 sudo[103271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vabsroaqqpvpddsvdjeyanlhvlkxgdms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361302.568064-416-203794157079919/AnsiballZ_podman_container_info.py'
Dec 10 10:08:23 compute-0 sudo[103271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:23 compute-0 python3.9[103273]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 10 10:08:23 compute-0 sudo[103271]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:24 compute-0 sudo[103446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-errwdagbwszfckudwekgoptgdidstvtq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361303.7996974-429-146943558406952/AnsiballZ_edpm_container_manage.py'
Dec 10 10:08:24 compute-0 sudo[103446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:24 compute-0 python3[103448]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 10 10:08:24 compute-0 podman[103483]: 2025-12-10 10:08:24.805356762 +0000 UTC m=+0.050670115 container create 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:08:24 compute-0 podman[103483]: 2025-12-10 10:08:24.77835435 +0000 UTC m=+0.023667733 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:08:24 compute-0 python3[103448]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host 
--pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:08:24 compute-0 sudo[103446]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:25 compute-0 sudo[103670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ackhguurwihrbgdklebavyiymcbsstdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361305.1164365-437-85188333388242/AnsiballZ_stat.py'
Dec 10 10:08:25 compute-0 sudo[103670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:25 compute-0 python3.9[103672]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:08:25 compute-0 sudo[103670]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:26 compute-0 sudo[103824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syimeqjlqxksradlukngqdmfwmgswluk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361305.830548-446-128028003351561/AnsiballZ_file.py'
Dec 10 10:08:26 compute-0 sudo[103824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:26 compute-0 python3.9[103826]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:26 compute-0 sudo[103824]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:26 compute-0 sudo[103900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rszbghxtaadjuefdjrcldgohmnytuhyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361305.830548-446-128028003351561/AnsiballZ_stat.py'
Dec 10 10:08:26 compute-0 sudo[103900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:26 compute-0 python3.9[103902]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:08:26 compute-0 sudo[103900]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:27 compute-0 sudo[104051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siuilldotuhyzshpuxbetrraybttakkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361306.9067175-446-136692503821725/AnsiballZ_copy.py'
Dec 10 10:08:27 compute-0 sudo[104051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:27 compute-0 python3.9[104053]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765361306.9067175-446-136692503821725/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:27 compute-0 sudo[104051]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:27 compute-0 sudo[104127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgsqljzpqyvweortnpmjqbnxvuxzesha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361306.9067175-446-136692503821725/AnsiballZ_systemd.py'
Dec 10 10:08:27 compute-0 sudo[104127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:28 compute-0 python3.9[104129]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:08:28 compute-0 systemd[1]: Reloading.
Dec 10 10:08:28 compute-0 systemd-sysv-generator[104160]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:08:28 compute-0 systemd-rc-local-generator[104155]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:08:28 compute-0 sudo[104127]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:28 compute-0 sudo[104239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjmoudxfsdgbdwmwcaokvovfxrzlrbej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361306.9067175-446-136692503821725/AnsiballZ_systemd.py'
Dec 10 10:08:28 compute-0 sudo[104239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:29 compute-0 python3.9[104241]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:08:29 compute-0 systemd[1]: Reloading.
Dec 10 10:08:29 compute-0 systemd-sysv-generator[104270]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:08:29 compute-0 systemd-rc-local-generator[104266]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:08:29 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Dec 10 10:08:29 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:08:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3f5de382ebcd98e94f5e6d131ff1d02116e5aab0b19ad2e5aa3a001a52a5548/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 10 10:08:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3f5de382ebcd98e94f5e6d131ff1d02116e5aab0b19ad2e5aa3a001a52a5548/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:08:29 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4.
Dec 10 10:08:29 compute-0 podman[104282]: 2025-12-10 10:08:29.508364877 +0000 UTC m=+0.135974633 container init 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: + sudo -E kolla_set_configs
Dec 10 10:08:29 compute-0 podman[104282]: 2025-12-10 10:08:29.544390451 +0000 UTC m=+0.172000167 container start 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Dec 10 10:08:29 compute-0 edpm-start-podman-container[104282]: ovn_metadata_agent
Dec 10 10:08:29 compute-0 podman[104304]: 2025-12-10 10:08:29.617277695 +0000 UTC m=+0.060197996 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Validating config file
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Copying service configuration files
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Writing out command to execute
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: ++ cat /run_command
Dec 10 10:08:29 compute-0 edpm-start-podman-container[104281]: Creating additional drop-in dependency for "ovn_metadata_agent" (3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4)
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: + CMD=neutron-ovn-metadata-agent
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: + ARGS=
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: + sudo kolla_copy_cacerts
Dec 10 10:08:29 compute-0 systemd[1]: Reloading.
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: + [[ ! -n '' ]]
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: + . kolla_extend_start
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: Running command: 'neutron-ovn-metadata-agent'
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: + umask 0022
Dec 10 10:08:29 compute-0 ovn_metadata_agent[104297]: + exec neutron-ovn-metadata-agent
Dec 10 10:08:29 compute-0 systemd-rc-local-generator[104373]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:08:29 compute-0 systemd-sysv-generator[104376]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:08:29 compute-0 systemd[1]: Started ovn_metadata_agent container.
Dec 10 10:08:29 compute-0 sudo[104239]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:30 compute-0 sshd-session[96054]: Connection closed by 192.168.122.30 port 57354
Dec 10 10:08:30 compute-0 sshd-session[96051]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:08:30 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Dec 10 10:08:30 compute-0 systemd[1]: session-22.scope: Consumed 36.236s CPU time.
Dec 10 10:08:30 compute-0 systemd-logind[787]: Session 22 logged out. Waiting for processes to exit.
Dec 10 10:08:30 compute-0 systemd-logind[787]: Removed session 22.
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.399 104302 INFO neutron.common.config [-] Logging enabled!
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.400 104302 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.400 104302 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.400 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.400 104302 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.400 104302 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.401 104302 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.401 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.401 104302 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.401 104302 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.401 104302 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.401 104302 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.401 104302 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.401 104302 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.401 104302 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.402 104302 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.402 104302 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.402 104302 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.402 104302 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.402 104302 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.402 104302 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.402 104302 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.402 104302 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.402 104302 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.403 104302 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.403 104302 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.403 104302 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.403 104302 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.403 104302 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.403 104302 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.403 104302 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.403 104302 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.404 104302 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.404 104302 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.404 104302 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.404 104302 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.404 104302 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.404 104302 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.404 104302 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.404 104302 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.405 104302 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.405 104302 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.405 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.405 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.405 104302 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.405 104302 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.405 104302 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.405 104302 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.405 104302 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.406 104302 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.406 104302 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.406 104302 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.406 104302 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.406 104302 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.406 104302 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.406 104302 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.406 104302 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.406 104302 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.407 104302 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.407 104302 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.407 104302 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.407 104302 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.407 104302 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.407 104302 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.407 104302 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.407 104302 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.408 104302 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.408 104302 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.408 104302 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.408 104302 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.408 104302 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.408 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.408 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.408 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.409 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.409 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.409 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.409 104302 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.409 104302 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.409 104302 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.409 104302 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.410 104302 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.410 104302 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.410 104302 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.410 104302 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.410 104302 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.410 104302 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.410 104302 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.411 104302 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.411 104302 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.411 104302 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.411 104302 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.411 104302 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.411 104302 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.411 104302 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.411 104302 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.412 104302 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.412 104302 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.412 104302 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.412 104302 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.412 104302 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.412 104302 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.412 104302 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.413 104302 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.413 104302 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.413 104302 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.413 104302 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.413 104302 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.413 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.413 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.414 104302 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.414 104302 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.414 104302 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.414 104302 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.414 104302 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.414 104302 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.414 104302 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.415 104302 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.415 104302 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.415 104302 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.415 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.415 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.415 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.415 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.416 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.416 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.416 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.416 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.416 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.416 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.416 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.417 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.417 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.417 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.417 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.417 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.417 104302 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.417 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.418 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.418 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.418 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.418 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.418 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.418 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.418 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.419 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.419 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.419 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.419 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.419 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.419 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.419 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.419 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.420 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.420 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.420 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.420 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.420 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.420 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.420 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.421 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.421 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.421 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.421 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.421 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.421 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.421 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.422 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.422 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.422 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.422 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.422 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.422 104302 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.423 104302 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.423 104302 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.423 104302 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.423 104302 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.423 104302 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.423 104302 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.423 104302 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.424 104302 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.424 104302 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.424 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.424 104302 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.424 104302 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.424 104302 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.425 104302 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.425 104302 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.425 104302 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.425 104302 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.425 104302 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.425 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.426 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.426 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.426 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.426 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.426 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.426 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.426 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.427 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.427 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.427 104302 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.427 104302 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.427 104302 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.427 104302 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.427 104302 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.427 104302 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.428 104302 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.428 104302 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.428 104302 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.428 104302 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.428 104302 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.428 104302 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.428 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.429 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.429 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.429 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.429 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.429 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.429 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.429 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.430 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.430 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.430 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.430 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.430 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.430 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.430 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.430 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.431 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.431 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.431 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.431 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.431 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.431 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.431 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.432 104302 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.432 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.432 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.432 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.432 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.432 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.432 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.433 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.433 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.433 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.433 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.433 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.433 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.433 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.434 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.434 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.434 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.434 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.434 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.434 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.434 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.435 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.435 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.435 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.435 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.435 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.435 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.435 104302 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.436 104302 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.436 104302 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.436 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.436 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.436 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.436 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.436 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.437 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.437 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.437 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.437 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.437 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.438 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.438 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.438 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.438 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.438 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.438 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.438 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.439 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.439 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.439 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.439 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.439 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.439 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.440 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.440 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.440 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.440 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.440 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.440 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.440 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.441 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.441 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.441 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.441 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.441 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.441 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.442 104302 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.442 104302 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.453 104302 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.453 104302 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.453 104302 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.454 104302 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.454 104302 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.469 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 65d7f098-ee7c-47ff-b5dd-8c0c64a94f34 (UUID: 65d7f098-ee7c-47ff-b5dd-8c0c64a94f34) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.495 104302 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.495 104302 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.495 104302 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.495 104302 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.498 104302 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.504 104302 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.509 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '65d7f098-ee7c-47ff-b5dd-8c0c64a94f34'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], external_ids={}, name=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, nb_cfg_timestamp=1765361263450, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.510 104302 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f84603c0130>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.511 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.511 104302 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.511 104302 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.511 104302 INFO oslo_service.service [-] Starting 1 workers
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.515 104302 DEBUG oslo_service.service [-] Started child 104409 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.518 104302 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpss6wv8g7/privsep.sock']
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.519 104409 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-391483'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.546 104409 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.547 104409 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.547 104409 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.552 104409 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.558 104409 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 10 10:08:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:31.566 104409 INFO eventlet.wsgi.server [-] (104409) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Dec 10 10:08:32 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 10 10:08:32 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:32.223 104302 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 10 10:08:32 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:32.224 104302 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpss6wv8g7/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 10 10:08:32 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:32.068 104414 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 10 10:08:32 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:32.072 104414 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 10 10:08:32 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:32.075 104414 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 10 10:08:32 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:32.075 104414 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104414
Dec 10 10:08:32 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:32.227 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[bd00b0ed-0cdf-4460-befe-38517319daa3]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:08:32 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:32.732 104414 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:08:32 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:32.732 104414 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:08:32 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:32.732 104414 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.255 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[5696d927-9b04-4e4d-91b0-d73a2b7314f2]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.257 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, column=external_ids, values=({'neutron:ovn-metadata-id': 'defc7a2d-dd1c-54db-b7db-6188357783ef'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.272 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.279 104302 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.279 104302 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.279 104302 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.279 104302 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.279 104302 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.280 104302 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.280 104302 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.280 104302 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.280 104302 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.281 104302 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.281 104302 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.281 104302 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.281 104302 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.281 104302 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.282 104302 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.282 104302 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.282 104302 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.282 104302 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.282 104302 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.283 104302 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.283 104302 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.283 104302 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.283 104302 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.284 104302 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.284 104302 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.284 104302 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.284 104302 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.284 104302 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.285 104302 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.285 104302 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.285 104302 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.285 104302 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.285 104302 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.286 104302 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.286 104302 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.286 104302 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.286 104302 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.286 104302 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.287 104302 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.287 104302 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.287 104302 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.287 104302 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.287 104302 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.287 104302 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.288 104302 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.288 104302 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.288 104302 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.288 104302 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.288 104302 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.288 104302 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.288 104302 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.289 104302 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.289 104302 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.289 104302 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.289 104302 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.289 104302 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.289 104302 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.289 104302 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.290 104302 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.290 104302 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.290 104302 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.290 104302 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.290 104302 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.290 104302 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.290 104302 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.291 104302 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.291 104302 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.291 104302 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.291 104302 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.291 104302 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.291 104302 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.292 104302 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.292 104302 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.292 104302 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.292 104302 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.292 104302 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.292 104302 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.293 104302 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.293 104302 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.293 104302 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.293 104302 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.293 104302 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.293 104302 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.294 104302 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.294 104302 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.294 104302 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.294 104302 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.294 104302 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.294 104302 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.294 104302 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.295 104302 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.295 104302 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.295 104302 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.295 104302 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.295 104302 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.296 104302 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.296 104302 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.296 104302 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.296 104302 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.296 104302 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.296 104302 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.296 104302 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.297 104302 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.297 104302 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.297 104302 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.297 104302 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.297 104302 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.297 104302 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.298 104302 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.298 104302 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.298 104302 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.298 104302 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.298 104302 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.298 104302 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.299 104302 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.299 104302 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.299 104302 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.299 104302 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.299 104302 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.300 104302 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.300 104302 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.300 104302 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.300 104302 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.300 104302 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.300 104302 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.301 104302 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.301 104302 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.301 104302 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.301 104302 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.301 104302 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.302 104302 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.302 104302 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.302 104302 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.302 104302 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.302 104302 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.302 104302 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.303 104302 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.303 104302 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.303 104302 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.303 104302 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.303 104302 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.303 104302 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.304 104302 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.304 104302 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.304 104302 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.304 104302 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.304 104302 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.304 104302 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.304 104302 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.305 104302 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.305 104302 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.305 104302 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.305 104302 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.305 104302 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.305 104302 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.306 104302 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.306 104302 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.306 104302 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.306 104302 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.306 104302 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.306 104302 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.306 104302 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.307 104302 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.307 104302 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.307 104302 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.307 104302 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.307 104302 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.307 104302 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.308 104302 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.308 104302 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.308 104302 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.308 104302 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.308 104302 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.308 104302 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.309 104302 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.309 104302 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.309 104302 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.309 104302 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.309 104302 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.309 104302 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.310 104302 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.310 104302 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.310 104302 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.310 104302 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.310 104302 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.310 104302 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.311 104302 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.311 104302 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.311 104302 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.311 104302 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.311 104302 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.311 104302 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.311 104302 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.312 104302 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.312 104302 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.312 104302 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.312 104302 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.312 104302 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.312 104302 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.313 104302 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.313 104302 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.313 104302 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.313 104302 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.313 104302 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.313 104302 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.313 104302 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.314 104302 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.314 104302 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.314 104302 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.314 104302 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.314 104302 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.314 104302 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.315 104302 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.315 104302 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.315 104302 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.315 104302 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.315 104302 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.315 104302 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.315 104302 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.316 104302 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.316 104302 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.316 104302 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.316 104302 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.316 104302 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.316 104302 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.317 104302 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.317 104302 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.317 104302 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.317 104302 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.317 104302 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.317 104302 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.318 104302 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.318 104302 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.318 104302 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.318 104302 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.318 104302 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.318 104302 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.319 104302 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.319 104302 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.319 104302 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.319 104302 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.319 104302 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.319 104302 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.320 104302 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.320 104302 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.320 104302 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.320 104302 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.320 104302 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.320 104302 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.321 104302 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.321 104302 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.321 104302 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.321 104302 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.321 104302 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.321 104302 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.322 104302 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.322 104302 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.322 104302 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.322 104302 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.322 104302 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.322 104302 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.322 104302 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.323 104302 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.323 104302 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.323 104302 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.323 104302 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.323 104302 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.323 104302 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.324 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.324 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.324 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.324 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.324 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.325 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.325 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.325 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.325 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.325 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.325 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.326 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.326 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.326 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.326 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.326 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.326 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.326 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.327 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.327 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.327 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.327 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.327 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.327 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.328 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.328 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.328 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.328 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.328 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.328 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.329 104302 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.329 104302 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.329 104302 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.329 104302 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.329 104302 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:08:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:08:33.329 104302 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 10 10:08:36 compute-0 podman[104419]: 2025-12-10 10:08:36.065998172 +0000 UTC m=+0.106105427 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 10 10:08:36 compute-0 sshd-session[104447]: Accepted publickey for zuul from 192.168.122.30 port 55526 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:08:36 compute-0 systemd-logind[787]: New session 23 of user zuul.
Dec 10 10:08:36 compute-0 systemd[1]: Started Session 23 of User zuul.
Dec 10 10:08:36 compute-0 sshd-session[104447]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:08:37 compute-0 python3.9[104600]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:08:38 compute-0 sudo[104754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjzrhouackydldpmhpprveuapxqmbsfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361318.1195438-34-143583033990919/AnsiballZ_command.py'
Dec 10 10:08:38 compute-0 sudo[104754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:38 compute-0 python3.9[104756]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:08:39 compute-0 sudo[104754]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:39 compute-0 sudo[104917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzmdxmtdxqjfperjbfobseftvtlgzzvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361319.337558-45-2587530355170/AnsiballZ_systemd_service.py'
Dec 10 10:08:39 compute-0 sudo[104917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:40 compute-0 python3.9[104919]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:08:40 compute-0 systemd[1]: Reloading.
Dec 10 10:08:40 compute-0 systemd-rc-local-generator[104945]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:08:40 compute-0 systemd-sysv-generator[104948]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:08:40 compute-0 sudo[104917]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:41 compute-0 python3.9[105103]: ansible-ansible.builtin.service_facts Invoked
Dec 10 10:08:41 compute-0 network[105120]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 10 10:08:41 compute-0 network[105121]: 'network-scripts' will be removed from distribution in near future.
Dec 10 10:08:41 compute-0 network[105122]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 10 10:08:44 compute-0 sudo[105381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orxkzxhsprcckshhyssfdzsobpmpctqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361324.1674516-64-262012007841826/AnsiballZ_systemd_service.py'
Dec 10 10:08:44 compute-0 sudo[105381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:44 compute-0 python3.9[105383]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:08:44 compute-0 sudo[105381]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:45 compute-0 sudo[105534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dudhcrzdniabooyfjukuzmvlyboeyhbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361324.9090211-64-170471124593552/AnsiballZ_systemd_service.py'
Dec 10 10:08:45 compute-0 sudo[105534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:45 compute-0 python3.9[105536]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:08:45 compute-0 sudo[105534]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:45 compute-0 sudo[105687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbetanyaeixbtvbxmpugniyaqonbthiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361325.679153-64-185488378712128/AnsiballZ_systemd_service.py'
Dec 10 10:08:45 compute-0 sudo[105687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:46 compute-0 python3.9[105689]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:08:46 compute-0 sudo[105687]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:46 compute-0 sudo[105840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reqajsyjbifkenxvlotjhzmqtfeccspu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361326.3720417-64-64453620889795/AnsiballZ_systemd_service.py'
Dec 10 10:08:46 compute-0 sudo[105840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:46 compute-0 python3.9[105842]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:08:46 compute-0 sudo[105840]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:47 compute-0 sudo[105993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxvktyoeghzmpzwdnxeqdlpchjbmquky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361327.1209853-64-54278294580287/AnsiballZ_systemd_service.py'
Dec 10 10:08:47 compute-0 sudo[105993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:47 compute-0 python3.9[105995]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:08:47 compute-0 sudo[105993]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:48 compute-0 sudo[106146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yafohyydujgiqmqummlxcjvlaujcxith ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361327.9196832-64-60353934853135/AnsiballZ_systemd_service.py'
Dec 10 10:08:48 compute-0 sudo[106146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:48 compute-0 python3.9[106148]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:08:48 compute-0 sudo[106146]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:49 compute-0 sudo[106299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viqlhsvrbgctyvygjbmufbhicghimpvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361328.7965753-64-69869520184961/AnsiballZ_systemd_service.py'
Dec 10 10:08:49 compute-0 sudo[106299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:49 compute-0 python3.9[106301]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:08:49 compute-0 sudo[106299]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:50 compute-0 sudo[106452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqupnemilmcskvisgpyudsxtoonnlfug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361329.71516-116-165980421547496/AnsiballZ_file.py'
Dec 10 10:08:50 compute-0 sudo[106452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:50 compute-0 python3.9[106454]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:50 compute-0 sudo[106452]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:50 compute-0 sudo[106604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ierwkewzfiirxvtwwkanyetysqsveods ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361330.7155437-116-150244563458456/AnsiballZ_file.py'
Dec 10 10:08:51 compute-0 sudo[106604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:51 compute-0 python3.9[106606]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:51 compute-0 sudo[106604]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:51 compute-0 sudo[106756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plbvyeklclaynkpcjsyxgmfbaghfcapq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361331.339013-116-108140924512479/AnsiballZ_file.py'
Dec 10 10:08:51 compute-0 sudo[106756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:51 compute-0 python3.9[106758]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:51 compute-0 sudo[106756]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:52 compute-0 sudo[106908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szbmjqzctstzzxfyghbyztbbinaqyqai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361331.9359858-116-226846071990281/AnsiballZ_file.py'
Dec 10 10:08:52 compute-0 sudo[106908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:52 compute-0 python3.9[106910]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:52 compute-0 sudo[106908]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:52 compute-0 sudo[107060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zddiltmykzqvwvntmnotepgditcagrym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361332.5585701-116-145525215124776/AnsiballZ_file.py'
Dec 10 10:08:52 compute-0 sudo[107060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:53 compute-0 python3.9[107062]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:53 compute-0 sudo[107060]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:53 compute-0 sudo[107212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbmdffcpdbwnaeygmiwwggcxbzmidiss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361333.1385958-116-140511292109913/AnsiballZ_file.py'
Dec 10 10:08:53 compute-0 sudo[107212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:53 compute-0 python3.9[107214]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:53 compute-0 sudo[107212]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:54 compute-0 sudo[107364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfmxbqvrksurfyklthcxaiebhipgbane ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361333.7406518-116-105684264346429/AnsiballZ_file.py'
Dec 10 10:08:54 compute-0 sudo[107364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:54 compute-0 python3.9[107366]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:54 compute-0 sudo[107364]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:54 compute-0 sudo[107516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpseulvkomyiikfxgwcxyfzkmremgjyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361334.3866878-166-49835238337072/AnsiballZ_file.py'
Dec 10 10:08:54 compute-0 sudo[107516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:54 compute-0 python3.9[107518]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:54 compute-0 sudo[107516]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:55 compute-0 sudo[107668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pplshwaqitvyuwgqszoipzpekbwginut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361334.958194-166-163861424419742/AnsiballZ_file.py'
Dec 10 10:08:55 compute-0 sudo[107668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:55 compute-0 python3.9[107670]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:55 compute-0 sudo[107668]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:55 compute-0 sudo[107820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlpolmxlricbippkqdlnhiitkhbxinhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361335.538855-166-236538703947679/AnsiballZ_file.py'
Dec 10 10:08:55 compute-0 sudo[107820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:56 compute-0 python3.9[107822]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:56 compute-0 sudo[107820]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:56 compute-0 sudo[107972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzozsatzeaoldlysrzghkzkddqicqwtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361336.2501605-166-150572359436767/AnsiballZ_file.py'
Dec 10 10:08:56 compute-0 sudo[107972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:56 compute-0 python3.9[107974]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:56 compute-0 sudo[107972]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:57 compute-0 sudo[108124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdpyysonsmzegitynmofxsvpgiufjqgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361336.951466-166-207338774680054/AnsiballZ_file.py'
Dec 10 10:08:57 compute-0 sudo[108124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:57 compute-0 python3.9[108126]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:57 compute-0 sudo[108124]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:57 compute-0 sudo[108276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofuqyxnymhqlynrljpxdxofidrotyoga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361337.5633743-166-190387483483599/AnsiballZ_file.py'
Dec 10 10:08:57 compute-0 sudo[108276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:58 compute-0 python3.9[108278]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:58 compute-0 sudo[108276]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:58 compute-0 sudo[108428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcjgysgiqyfopsbvglxrojzcreuxmpfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361338.23181-166-35205140623395/AnsiballZ_file.py'
Dec 10 10:08:58 compute-0 sudo[108428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:58 compute-0 python3.9[108430]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:08:58 compute-0 sudo[108428]: pam_unix(sudo:session): session closed for user root
Dec 10 10:08:59 compute-0 sudo[108580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltjybkdkrbbxkxcesnbvnvnyglyoiukx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361338.867992-217-4193998780451/AnsiballZ_command.py'
Dec 10 10:08:59 compute-0 sudo[108580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:08:59 compute-0 python3.9[108582]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:08:59 compute-0 sudo[108580]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:00 compute-0 podman[108705]: 2025-12-10 10:09:00.035467082 +0000 UTC m=+0.084162862 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, 
config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:09:00 compute-0 python3.9[108753]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 10 10:09:00 compute-0 sudo[108906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdjriejizmnasuzqpieywetqycyaabxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361340.475074-235-47469470146411/AnsiballZ_systemd_service.py'
Dec 10 10:09:00 compute-0 sudo[108906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:01 compute-0 python3.9[108908]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:09:01 compute-0 systemd[1]: Reloading.
Dec 10 10:09:01 compute-0 systemd-sysv-generator[108939]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:09:01 compute-0 systemd-rc-local-generator[108936]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:09:01 compute-0 sudo[108906]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:01 compute-0 sudo[109094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdozstggpguasznshnmeirwducqiurpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361341.4889772-243-249446857952160/AnsiballZ_command.py'
Dec 10 10:09:01 compute-0 sudo[109094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:01 compute-0 python3.9[109096]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:09:01 compute-0 sudo[109094]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:02 compute-0 sudo[109247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osnedccgmttuzmuvbuiajtortbvvdfcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361342.0609744-243-221226746558953/AnsiballZ_command.py'
Dec 10 10:09:02 compute-0 sudo[109247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:02 compute-0 python3.9[109249]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:09:02 compute-0 sudo[109247]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:03 compute-0 sudo[109400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwpphavfglrqrknbcmuxaqtjppdlduca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361342.8872864-243-237952962055402/AnsiballZ_command.py'
Dec 10 10:09:03 compute-0 sudo[109400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:03 compute-0 python3.9[109402]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:09:03 compute-0 sudo[109400]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:03 compute-0 sudo[109553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqirgdffxembpolzdrzgpisrwllzoqqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361343.6495202-243-135127621608115/AnsiballZ_command.py'
Dec 10 10:09:03 compute-0 sudo[109553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:04 compute-0 python3.9[109555]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:09:04 compute-0 sudo[109553]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:04 compute-0 sudo[109706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekodoouvruajphnrepdxwfynubpphgey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361344.315581-243-149587021716576/AnsiballZ_command.py'
Dec 10 10:09:04 compute-0 sudo[109706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:04 compute-0 python3.9[109708]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:09:04 compute-0 sudo[109706]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:05 compute-0 sudo[109859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmupgxcxomwwyrwzpbacosmkidjdufpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361344.999618-243-186688291712554/AnsiballZ_command.py'
Dec 10 10:09:05 compute-0 sudo[109859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:05 compute-0 python3.9[109861]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:09:05 compute-0 sudo[109859]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:05 compute-0 sudo[110012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iymbwbmtadqtkuwjyreljsbkyeyyhuov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361345.6575866-243-75471978826681/AnsiballZ_command.py'
Dec 10 10:09:05 compute-0 sudo[110012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:06 compute-0 python3.9[110014]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:09:06 compute-0 sudo[110012]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:06 compute-0 podman[110016]: 2025-12-10 10:09:06.327602143 +0000 UTC m=+0.134369909 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 10 10:09:06 compute-0 sudo[110191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pihuaozbydipzmdssuhcwmgxvaolhxaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361346.5248554-297-105299267873665/AnsiballZ_getent.py'
Dec 10 10:09:06 compute-0 sudo[110191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:07 compute-0 python3.9[110193]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 10 10:09:07 compute-0 sudo[110191]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:07 compute-0 sudo[110344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcmnrhethjurggsqmfhfqbvzwjmmvitg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361347.3021865-305-214681525792740/AnsiballZ_group.py'
Dec 10 10:09:07 compute-0 sudo[110344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:08 compute-0 python3.9[110346]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 10 10:09:08 compute-0 groupadd[110347]: group added to /etc/group: name=libvirt, GID=42473
Dec 10 10:09:08 compute-0 groupadd[110347]: group added to /etc/gshadow: name=libvirt
Dec 10 10:09:08 compute-0 groupadd[110347]: new group: name=libvirt, GID=42473
Dec 10 10:09:08 compute-0 sudo[110344]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:08 compute-0 sudo[110502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmjfuhgdmbxcxhwnqieypkxcyapvjfsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361348.3713806-313-263490952482583/AnsiballZ_user.py'
Dec 10 10:09:08 compute-0 sudo[110502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:09 compute-0 python3.9[110504]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 10 10:09:09 compute-0 useradd[110506]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Dec 10 10:09:09 compute-0 sudo[110502]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:09 compute-0 sudo[110662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noysomlkfoldadnqniydahrmstznrptl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361349.7075274-324-50676335079786/AnsiballZ_setup.py'
Dec 10 10:09:09 compute-0 sudo[110662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:10 compute-0 python3.9[110664]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:09:10 compute-0 sudo[110662]: pam_unix(sudo:session): session closed for user root
Dec 10 10:09:10 compute-0 sudo[110746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgwtvizwhwehmnlgvrypvmpnosxgpdop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361349.7075274-324-50676335079786/AnsiballZ_dnf.py'
Dec 10 10:09:10 compute-0 sudo[110746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:09:11 compute-0 python3.9[110748]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:09:31 compute-0 podman[110939]: 2025-12-10 10:09:31.09591863 +0000 UTC m=+0.121562790 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:09:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:09:31.444 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:09:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:09:31.445 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:09:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:09:31.445 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:09:36 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Dec 10 10:09:36 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 10 10:09:36 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 10 10:09:36 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 10 10:09:36 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 10 10:09:36 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 10 10:09:36 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 10 10:09:36 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 10 10:09:36 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec 10 10:09:37 compute-0 podman[110965]: 2025-12-10 10:09:37.100070258 +0000 UTC m=+0.132830874 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 10 10:09:46 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Dec 10 10:09:46 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 10 10:09:46 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 10 10:09:46 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 10 10:09:46 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 10 10:09:46 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 10 10:09:46 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 10 10:09:46 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 10 10:10:01 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 10 10:10:02 compute-0 podman[113044]: 2025-12-10 10:10:02.033601898 +0000 UTC m=+0.059296471 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 10 10:10:08 compute-0 podman[116734]: 2025-12-10 10:10:08.095574385 +0000 UTC m=+0.141146941 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 10 10:10:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:10:31.446 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:10:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:10:31.446 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:10:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:10:31.447 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:10:33 compute-0 podman[127846]: 2025-12-10 10:10:33.075513626 +0000 UTC m=+0.101754482 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 10 10:10:39 compute-0 podman[127867]: 2025-12-10 10:10:39.226842577 +0000 UTC m=+0.092547211 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 10 10:10:40 compute-0 kernel: SELinux:  Converting 2759 SID table entries...
Dec 10 10:10:40 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 10 10:10:40 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 10 10:10:40 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 10 10:10:40 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 10 10:10:40 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 10 10:10:40 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 10 10:10:40 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 10 10:10:42 compute-0 groupadd[127904]: group added to /etc/group: name=dnsmasq, GID=992
Dec 10 10:10:42 compute-0 groupadd[127904]: group added to /etc/gshadow: name=dnsmasq
Dec 10 10:10:42 compute-0 groupadd[127904]: new group: name=dnsmasq, GID=992
Dec 10 10:10:42 compute-0 useradd[127911]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 10 10:10:42 compute-0 dbus-broker-launch[745]: Noticed file-system modification, trigger reload.
Dec 10 10:10:42 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 10 10:10:42 compute-0 dbus-broker-launch[745]: Noticed file-system modification, trigger reload.
Dec 10 10:10:44 compute-0 groupadd[127924]: group added to /etc/group: name=clevis, GID=991
Dec 10 10:10:44 compute-0 groupadd[127924]: group added to /etc/gshadow: name=clevis
Dec 10 10:10:44 compute-0 groupadd[127924]: new group: name=clevis, GID=991
Dec 10 10:10:44 compute-0 useradd[127931]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 10 10:10:44 compute-0 usermod[127941]: add 'clevis' to group 'tss'
Dec 10 10:10:44 compute-0 usermod[127941]: add 'clevis' to shadow group 'tss'
Dec 10 10:10:49 compute-0 polkitd[43606]: Reloading rules
Dec 10 10:10:49 compute-0 polkitd[43606]: Collecting garbage unconditionally...
Dec 10 10:10:49 compute-0 polkitd[43606]: Loading rules from directory /etc/polkit-1/rules.d
Dec 10 10:10:49 compute-0 polkitd[43606]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 10 10:10:49 compute-0 polkitd[43606]: Finished loading, compiling and executing 3 rules
Dec 10 10:10:49 compute-0 polkitd[43606]: Reloading rules
Dec 10 10:10:49 compute-0 polkitd[43606]: Collecting garbage unconditionally...
Dec 10 10:10:49 compute-0 polkitd[43606]: Loading rules from directory /etc/polkit-1/rules.d
Dec 10 10:10:49 compute-0 polkitd[43606]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 10 10:10:49 compute-0 polkitd[43606]: Finished loading, compiling and executing 3 rules
Dec 10 10:10:52 compute-0 groupadd[128128]: group added to /etc/group: name=ceph, GID=167
Dec 10 10:10:52 compute-0 groupadd[128128]: group added to /etc/gshadow: name=ceph
Dec 10 10:10:52 compute-0 groupadd[128128]: new group: name=ceph, GID=167
Dec 10 10:10:52 compute-0 useradd[128134]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 10 10:10:57 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Dec 10 10:10:57 compute-0 sshd[1007]: Received signal 15; terminating.
Dec 10 10:10:57 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Dec 10 10:10:57 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Dec 10 10:10:57 compute-0 systemd[1]: sshd.service: Consumed 1.661s CPU time, read 32.0K from disk, written 8.0K to disk.
Dec 10 10:10:57 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Dec 10 10:10:57 compute-0 systemd[1]: Stopping sshd-keygen.target...
Dec 10 10:10:57 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 10 10:10:57 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 10 10:10:57 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 10 10:10:57 compute-0 systemd[1]: Reached target sshd-keygen.target.
Dec 10 10:10:57 compute-0 systemd[1]: Starting OpenSSH server daemon...
Dec 10 10:10:57 compute-0 sshd[128653]: Server listening on 0.0.0.0 port 22.
Dec 10 10:10:57 compute-0 sshd[128653]: Server listening on :: port 22.
Dec 10 10:10:57 compute-0 systemd[1]: Started OpenSSH server daemon.
Dec 10 10:10:59 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 10 10:10:59 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 10 10:10:59 compute-0 systemd[1]: Reloading.
Dec 10 10:10:59 compute-0 systemd-sysv-generator[128911]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:10:59 compute-0 systemd-rc-local-generator[128907]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:10:59 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 10 10:11:04 compute-0 podman[133624]: 2025-12-10 10:11:04.033824206 +0000 UTC m=+0.070033052 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 10 10:11:05 compute-0 sudo[110746]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:06 compute-0 sudo[136043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsjllmngxflxcrluskkludhzzagastkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361466.0887995-336-12611800431262/AnsiballZ_systemd.py'
Dec 10 10:11:06 compute-0 sudo[136043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:07 compute-0 python3.9[136071]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 10 10:11:07 compute-0 systemd[1]: Reloading.
Dec 10 10:11:07 compute-0 systemd-sysv-generator[136594]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:11:07 compute-0 systemd-rc-local-generator[136587]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:11:07 compute-0 sudo[136043]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:07 compute-0 sudo[137273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdfdjppvsoiyejauczhbbsrrkfkrbldn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361467.5061967-336-176683577108066/AnsiballZ_systemd.py'
Dec 10 10:11:07 compute-0 sudo[137273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:08 compute-0 python3.9[137302]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 10 10:11:08 compute-0 systemd[1]: Reloading.
Dec 10 10:11:08 compute-0 systemd-rc-local-generator[137571]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:11:08 compute-0 systemd-sysv-generator[137574]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:11:08 compute-0 sudo[137273]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:08 compute-0 sudo[137844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxqqrbeduvoywivggecbadtifqidnmez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361468.64739-336-97386371288282/AnsiballZ_systemd.py'
Dec 10 10:11:08 compute-0 sudo[137844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:09 compute-0 python3.9[137846]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 10 10:11:09 compute-0 systemd[1]: Reloading.
Dec 10 10:11:09 compute-0 systemd-rc-local-generator[137901]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:11:09 compute-0 systemd-sysv-generator[137905]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:11:09 compute-0 podman[137849]: 2025-12-10 10:11:09.446738183 +0000 UTC m=+0.135311834 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:11:09 compute-0 sudo[137844]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:09 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 10 10:11:09 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 10 10:11:09 compute-0 systemd[1]: man-db-cache-update.service: Consumed 10.622s CPU time.
Dec 10 10:11:09 compute-0 systemd[1]: run-r694bd048bf664eb0b69b1126acb2bb06.service: Deactivated successfully.
Dec 10 10:11:10 compute-0 sudo[138062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjuyusrroisoftztokuumwbwzxneatbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361469.9444928-336-268560182269699/AnsiballZ_systemd.py'
Dec 10 10:11:10 compute-0 sudo[138062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:10 compute-0 python3.9[138064]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 10 10:11:10 compute-0 systemd[1]: Reloading.
Dec 10 10:11:10 compute-0 systemd-sysv-generator[138092]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:11:10 compute-0 systemd-rc-local-generator[138085]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:11:11 compute-0 sudo[138062]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:11 compute-0 sudo[138251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iukhqjqfngxexqgktiljqbhojvzngjza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361471.2456033-365-249304282035148/AnsiballZ_systemd.py'
Dec 10 10:11:11 compute-0 sudo[138251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:11 compute-0 python3.9[138253]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:11 compute-0 systemd[1]: Reloading.
Dec 10 10:11:12 compute-0 systemd-sysv-generator[138283]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:11:12 compute-0 systemd-rc-local-generator[138279]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:11:12 compute-0 sudo[138251]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:12 compute-0 sudo[138441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvwbyhhvpuslcmokshzybbexgtvgntvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361472.3672032-365-266043876644748/AnsiballZ_systemd.py'
Dec 10 10:11:12 compute-0 sudo[138441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:12 compute-0 python3.9[138443]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:13 compute-0 systemd[1]: Reloading.
Dec 10 10:11:13 compute-0 systemd-rc-local-generator[138472]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:11:13 compute-0 systemd-sysv-generator[138477]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:11:13 compute-0 sudo[138441]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:13 compute-0 sudo[138632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bthpjhgzchixmcdyxzomhsnhmriyegtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361473.4228654-365-209546089184186/AnsiballZ_systemd.py'
Dec 10 10:11:13 compute-0 sudo[138632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:14 compute-0 python3.9[138634]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:14 compute-0 systemd[1]: Reloading.
Dec 10 10:11:14 compute-0 systemd-rc-local-generator[138659]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:11:14 compute-0 systemd-sysv-generator[138666]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:11:14 compute-0 sudo[138632]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:14 compute-0 sudo[138822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckhqusffunqvuqsqgghhonzkbjmxkmhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361474.5748644-365-97489764786449/AnsiballZ_systemd.py'
Dec 10 10:11:14 compute-0 sudo[138822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:15 compute-0 python3.9[138824]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:15 compute-0 sudo[138822]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:15 compute-0 sudo[138977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlokrtequxnqkchlhlkknwhcovwjmmnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361475.3807333-365-242092416510755/AnsiballZ_systemd.py'
Dec 10 10:11:15 compute-0 sudo[138977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:16 compute-0 python3.9[138979]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:16 compute-0 systemd[1]: Reloading.
Dec 10 10:11:16 compute-0 systemd-rc-local-generator[139010]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:11:16 compute-0 systemd-sysv-generator[139015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:11:16 compute-0 sudo[138977]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:16 compute-0 sudo[139167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnjkhezqouyymnlqhcxasnwzxbpuargo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361476.5757556-401-217357852711023/AnsiballZ_systemd.py'
Dec 10 10:11:16 compute-0 sudo[139167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:17 compute-0 python3.9[139169]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 10 10:11:17 compute-0 systemd[1]: Reloading.
Dec 10 10:11:17 compute-0 systemd-rc-local-generator[139201]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:11:17 compute-0 systemd-sysv-generator[139205]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:11:17 compute-0 sshd-session[138486]: banner exchange: Connection from 195.88.120.62 port 58400: invalid format
Dec 10 10:11:17 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 10 10:11:17 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 10 10:11:17 compute-0 sudo[139167]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:18 compute-0 sudo[139361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eycrzxbzqvuyqocrwvcctpvofnhbzrdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361477.7817378-409-99497873782222/AnsiballZ_systemd.py'
Dec 10 10:11:18 compute-0 sudo[139361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:18 compute-0 python3.9[139363]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:18 compute-0 sudo[139361]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:18 compute-0 sudo[139516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsdhyokxwmxveguqbdgldmmezcuylhus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361478.648369-409-33929229172985/AnsiballZ_systemd.py'
Dec 10 10:11:18 compute-0 sudo[139516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:19 compute-0 python3.9[139518]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:19 compute-0 sudo[139516]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:19 compute-0 sudo[139671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtmkjgqnermfhhjhftsorczftzhvklyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361479.4153035-409-246137912280310/AnsiballZ_systemd.py'
Dec 10 10:11:19 compute-0 sudo[139671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:20 compute-0 python3.9[139673]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:20 compute-0 sudo[139671]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:20 compute-0 sudo[139826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndaujzgffpezhvczqskuxotjwtkounkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361480.2287836-409-125458469359398/AnsiballZ_systemd.py'
Dec 10 10:11:20 compute-0 sudo[139826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:20 compute-0 python3.9[139828]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:20 compute-0 sudo[139826]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:21 compute-0 sudo[139981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koqwkiadgfpgdvnzqxyuxhwppshxpbkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361481.0942857-409-193522761586178/AnsiballZ_systemd.py'
Dec 10 10:11:21 compute-0 sudo[139981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:21 compute-0 python3.9[139983]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:21 compute-0 sudo[139981]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:22 compute-0 sudo[140136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aislpnayciyzzwovbfyctnlkvyvlltjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361481.8948176-409-9877807384925/AnsiballZ_systemd.py'
Dec 10 10:11:22 compute-0 sudo[140136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:22 compute-0 python3.9[140138]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:22 compute-0 sudo[140136]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:22 compute-0 sudo[140291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emhteyfwxaliwaxvyzbcdpvdqaphjpis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361482.6948647-409-186302175618434/AnsiballZ_systemd.py'
Dec 10 10:11:22 compute-0 sudo[140291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:23 compute-0 python3.9[140293]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:23 compute-0 sudo[140291]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:23 compute-0 sudo[140446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfvglbjtuksjtpbszqynthtyeqjqtzwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361483.4903448-409-52937524705379/AnsiballZ_systemd.py'
Dec 10 10:11:23 compute-0 sudo[140446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:24 compute-0 python3.9[140448]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:24 compute-0 sudo[140446]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:24 compute-0 sudo[140601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scakbeyjixdwawdoegwrllhxzaluitcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361484.2728853-409-48273446964688/AnsiballZ_systemd.py'
Dec 10 10:11:24 compute-0 sudo[140601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:24 compute-0 python3.9[140603]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:25 compute-0 sudo[140601]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:25 compute-0 sudo[140756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yltzgoozgqyjmuesgblfmxmfhwvehpza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361485.1879272-409-109758531563431/AnsiballZ_systemd.py'
Dec 10 10:11:25 compute-0 sudo[140756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:25 compute-0 python3.9[140758]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:25 compute-0 sudo[140756]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:26 compute-0 sudo[140911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awchiypwbucvcrbjyqmuxqvvwomirbct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361486.1347039-409-43636269898331/AnsiballZ_systemd.py'
Dec 10 10:11:26 compute-0 sudo[140911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:26 compute-0 python3.9[140913]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:26 compute-0 sudo[140911]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:27 compute-0 sudo[141066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwgwkdrphrsobwhsznrsydtbhxmhnnmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361486.9989738-409-149364141019249/AnsiballZ_systemd.py'
Dec 10 10:11:27 compute-0 sudo[141066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:27 compute-0 python3.9[141068]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:27 compute-0 sudo[141066]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:28 compute-0 sudo[141221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogvuibdtfrykksmedcxxeyoruzshfndf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361487.8885882-409-62534966116786/AnsiballZ_systemd.py'
Dec 10 10:11:28 compute-0 sudo[141221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:28 compute-0 python3.9[141223]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:28 compute-0 sudo[141221]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:29 compute-0 sudo[141376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfxswjoggpmyidwggsgquqpgyxmzuyhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361488.7346263-409-230606580608634/AnsiballZ_systemd.py'
Dec 10 10:11:29 compute-0 sudo[141376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:29 compute-0 python3.9[141378]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 10 10:11:29 compute-0 sudo[141376]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:29 compute-0 sudo[141531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybyjcujrubqbdbqylecfswvzkwfifvit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361489.7025845-511-94732356592685/AnsiballZ_file.py'
Dec 10 10:11:29 compute-0 sudo[141531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:30 compute-0 python3.9[141533]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:11:30 compute-0 sudo[141531]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:30 compute-0 sudo[141683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzhmxllrrmspqpkbhtbrgxtrykobwbrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361490.3535483-511-132069562972380/AnsiballZ_file.py'
Dec 10 10:11:30 compute-0 sudo[141683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:30 compute-0 python3.9[141685]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:11:30 compute-0 sudo[141683]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:31 compute-0 sudo[141835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjifncwxnqiplkolqrltawizafazxdjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361490.9447832-511-104076966237182/AnsiballZ_file.py'
Dec 10 10:11:31 compute-0 sudo[141835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:31 compute-0 python3.9[141837]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:11:31 compute-0 sudo[141835]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:11:31.447 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:11:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:11:31.448 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:11:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:11:31.449 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:11:31 compute-0 sudo[141987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czlqtrkkglcvukahmywgasnjleykxvrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361491.5480776-511-78624556922768/AnsiballZ_file.py'
Dec 10 10:11:31 compute-0 sudo[141987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:31 compute-0 python3.9[141989]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:11:31 compute-0 sudo[141987]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:32 compute-0 sudo[142139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egrnixgtemngiguerjdeoavejkzdbkts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361492.137781-511-62849951137268/AnsiballZ_file.py'
Dec 10 10:11:32 compute-0 sudo[142139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:32 compute-0 python3.9[142141]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:11:32 compute-0 sudo[142139]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:33 compute-0 sudo[142291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiqajmyzuvykrmozvyrxjhveocrnuvhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361492.7710047-511-214670677093224/AnsiballZ_file.py'
Dec 10 10:11:33 compute-0 sudo[142291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:33 compute-0 python3.9[142293]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:11:33 compute-0 sudo[142291]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:33 compute-0 sudo[142443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiimyrzpjzeeyafaqoiqddjlsiwvxsco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361493.4693356-554-93245127977720/AnsiballZ_stat.py'
Dec 10 10:11:33 compute-0 sudo[142443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:34 compute-0 python3.9[142445]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:34 compute-0 sudo[142443]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:34 compute-0 sudo[142581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufdcleohsistevyeiugpirwocopslxml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361493.4693356-554-93245127977720/AnsiballZ_copy.py'
Dec 10 10:11:34 compute-0 sudo[142581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:34 compute-0 podman[142542]: 2025-12-10 10:11:34.610419364 +0000 UTC m=+0.051498569 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 10 10:11:34 compute-0 python3.9[142588]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765361493.4693356-554-93245127977720/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:34 compute-0 sudo[142581]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:35 compute-0 sudo[142740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jewuexrmfomsexcqhvmudpdvjvcosbfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361494.9147384-554-236768771245898/AnsiballZ_stat.py'
Dec 10 10:11:35 compute-0 sudo[142740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:35 compute-0 python3.9[142742]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:35 compute-0 sudo[142740]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:36 compute-0 sudo[142865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiudfqmkoeohuykfdfpslmboppotepqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361494.9147384-554-236768771245898/AnsiballZ_copy.py'
Dec 10 10:11:36 compute-0 sudo[142865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:36 compute-0 python3.9[142867]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765361494.9147384-554-236768771245898/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:36 compute-0 sudo[142865]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:36 compute-0 sudo[143017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pngjykifflnjlacutxxivylfzmdhubaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361496.4309773-554-206499880056484/AnsiballZ_stat.py'
Dec 10 10:11:36 compute-0 sudo[143017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:36 compute-0 python3.9[143019]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:36 compute-0 sudo[143017]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:37 compute-0 sudo[143142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqwmcplmpcxzyltqztrrqkupkzhurram ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361496.4309773-554-206499880056484/AnsiballZ_copy.py'
Dec 10 10:11:37 compute-0 sudo[143142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:37 compute-0 python3.9[143144]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765361496.4309773-554-206499880056484/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:37 compute-0 sudo[143142]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:37 compute-0 sudo[143294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrbcxhrgfxftacvbnsaecqhppfvvbkyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361497.6368988-554-189939850398409/AnsiballZ_stat.py'
Dec 10 10:11:37 compute-0 sudo[143294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:38 compute-0 python3.9[143296]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:38 compute-0 sudo[143294]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:38 compute-0 sudo[143419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzqtswofuxxsiebccuycmuymmiprtfiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361497.6368988-554-189939850398409/AnsiballZ_copy.py'
Dec 10 10:11:38 compute-0 sudo[143419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:38 compute-0 python3.9[143421]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765361497.6368988-554-189939850398409/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:38 compute-0 sudo[143419]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:39 compute-0 sudo[143571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rujntcsuofmsaukgqsrutqainkuankot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361498.8207653-554-6678696200599/AnsiballZ_stat.py'
Dec 10 10:11:39 compute-0 sudo[143571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:39 compute-0 python3.9[143573]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:39 compute-0 sudo[143571]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:39 compute-0 sudo[143696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaqhmwucgpfidpgsjwxchcflalkvsmah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361498.8207653-554-6678696200599/AnsiballZ_copy.py'
Dec 10 10:11:39 compute-0 sudo[143696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:39 compute-0 podman[143698]: 2025-12-10 10:11:39.855585107 +0000 UTC m=+0.116435022 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202)
Dec 10 10:11:39 compute-0 python3.9[143699]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765361498.8207653-554-6678696200599/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:39 compute-0 sudo[143696]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:40 compute-0 sudo[143872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koajmedzvjeqffpsyfgssoblqpncnahg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361500.0957332-554-152265421537306/AnsiballZ_stat.py'
Dec 10 10:11:40 compute-0 sudo[143872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:40 compute-0 python3.9[143874]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:40 compute-0 sudo[143872]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:40 compute-0 sudo[143997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itvuhtxsfgvlucclmpectakkalxjoltr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361500.0957332-554-152265421537306/AnsiballZ_copy.py'
Dec 10 10:11:40 compute-0 sudo[143997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:41 compute-0 python3.9[143999]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765361500.0957332-554-152265421537306/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:41 compute-0 sudo[143997]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:41 compute-0 sudo[144149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfrznrcvbkrtlosqvthwvxnvwizzbpcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361501.6622758-554-159634239550439/AnsiballZ_stat.py'
Dec 10 10:11:41 compute-0 sudo[144149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:42 compute-0 python3.9[144151]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:42 compute-0 sudo[144149]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:42 compute-0 sudo[144272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hulysggubeeauqgtyqxhixqrswwlzdvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361501.6622758-554-159634239550439/AnsiballZ_copy.py'
Dec 10 10:11:42 compute-0 sudo[144272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:42 compute-0 python3.9[144274]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765361501.6622758-554-159634239550439/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:42 compute-0 sudo[144272]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:43 compute-0 sudo[144424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghuzehcngxfbmregjeuumpcjtgxefxfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361502.8767927-554-54848210589882/AnsiballZ_stat.py'
Dec 10 10:11:43 compute-0 sudo[144424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:43 compute-0 python3.9[144426]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:43 compute-0 sudo[144424]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:43 compute-0 sudo[144549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aakhmxjrzqfvmiyjsyvywqfglcnahjfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361502.8767927-554-54848210589882/AnsiballZ_copy.py'
Dec 10 10:11:43 compute-0 sudo[144549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:43 compute-0 python3.9[144551]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765361502.8767927-554-54848210589882/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:43 compute-0 sudo[144549]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:44 compute-0 sudo[144701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdnbmjkmerscpcoxtluivwhlxpqvjqrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361504.0944035-667-224570375770869/AnsiballZ_command.py'
Dec 10 10:11:44 compute-0 sudo[144701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:44 compute-0 python3.9[144703]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 10 10:11:44 compute-0 sudo[144701]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:45 compute-0 sudo[144854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kogqjdksmkgthvzyknbykizdwwbhdnzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361504.78142-676-163547105985052/AnsiballZ_file.py'
Dec 10 10:11:45 compute-0 sudo[144854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:45 compute-0 python3.9[144856]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:45 compute-0 sudo[144854]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:45 compute-0 sudo[145006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlfbnmhizgezjaptrwmqisbgvdqumzfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361505.3969607-676-72129885381432/AnsiballZ_file.py'
Dec 10 10:11:45 compute-0 sudo[145006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:45 compute-0 python3.9[145008]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:45 compute-0 sudo[145006]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:46 compute-0 sudo[145158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtgdvfjzdjrwhhozunrferncrhgxkpws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361505.970688-676-220132711726111/AnsiballZ_file.py'
Dec 10 10:11:46 compute-0 sudo[145158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:46 compute-0 python3.9[145160]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:46 compute-0 sudo[145158]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:46 compute-0 sudo[145310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnmhczaombknpewlgosxvwuvklaypqmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361506.5959418-676-90690833827203/AnsiballZ_file.py'
Dec 10 10:11:46 compute-0 sudo[145310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:47 compute-0 python3.9[145312]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:47 compute-0 sudo[145310]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:47 compute-0 sudo[145462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiayacpambnxjtijdowtjpvcpvitlvdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361507.22404-676-9878750805708/AnsiballZ_file.py'
Dec 10 10:11:47 compute-0 sudo[145462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:47 compute-0 python3.9[145464]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:47 compute-0 sudo[145462]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:48 compute-0 sudo[145614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eziuzersmwcwxdaizjcmwuxuecmcjtvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361507.8263252-676-104789953232974/AnsiballZ_file.py'
Dec 10 10:11:48 compute-0 sudo[145614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:48 compute-0 python3.9[145616]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:48 compute-0 sudo[145614]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:48 compute-0 sudo[145766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtutxddqtydkjemuomymptrvomacbnod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361508.417469-676-249227416626466/AnsiballZ_file.py'
Dec 10 10:11:48 compute-0 sudo[145766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:48 compute-0 python3.9[145768]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:48 compute-0 sudo[145766]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:49 compute-0 sudo[145918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksyrltrolfqrbnpiywrhalaviirhczwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361509.003583-676-63406514002954/AnsiballZ_file.py'
Dec 10 10:11:49 compute-0 sudo[145918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:49 compute-0 python3.9[145920]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:49 compute-0 sudo[145918]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:49 compute-0 sudo[146070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdlfexwszkpzgvpnkxfmlgfnqarqdpxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361509.620294-676-108242892844082/AnsiballZ_file.py'
Dec 10 10:11:49 compute-0 sudo[146070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:50 compute-0 python3.9[146072]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:50 compute-0 sudo[146070]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:50 compute-0 sudo[146222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huofgslwnisfqojfhalqeoslwpltwisk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361510.205469-676-42893996034659/AnsiballZ_file.py'
Dec 10 10:11:50 compute-0 sudo[146222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:50 compute-0 python3.9[146224]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:50 compute-0 sudo[146222]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:51 compute-0 sudo[146374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrpgbtnkdrhcdvigktyyaqrwhcivwzbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361510.799424-676-198224782271562/AnsiballZ_file.py'
Dec 10 10:11:51 compute-0 sudo[146374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:51 compute-0 python3.9[146376]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:51 compute-0 sudo[146374]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:51 compute-0 sudo[146526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpuenrlkmsxaatoprmbjuvsgctpxwutc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361511.3948603-676-221189464511493/AnsiballZ_file.py'
Dec 10 10:11:51 compute-0 sudo[146526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:51 compute-0 python3.9[146528]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:51 compute-0 sudo[146526]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:52 compute-0 sudo[146678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roirrbmvohfretcyiwfvwnvqptgkggds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361512.0136855-676-115692420390336/AnsiballZ_file.py'
Dec 10 10:11:52 compute-0 sudo[146678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:52 compute-0 python3.9[146680]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:52 compute-0 sudo[146678]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:52 compute-0 sudo[146830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfdocbspucseivixlxhcvjsdykrnuiwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361512.6436918-676-64721990366615/AnsiballZ_file.py'
Dec 10 10:11:52 compute-0 sudo[146830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:53 compute-0 python3.9[146832]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:53 compute-0 sudo[146830]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:53 compute-0 sudo[146982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkwokvyciystdywwxsphxxveatqnbdtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361513.2916567-775-267821413974582/AnsiballZ_stat.py'
Dec 10 10:11:53 compute-0 sudo[146982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:53 compute-0 python3.9[146984]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:53 compute-0 sudo[146982]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:54 compute-0 sudo[147105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvwkwsdaanbqzhhgsojfyyvlqyowbqsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361513.2916567-775-267821413974582/AnsiballZ_copy.py'
Dec 10 10:11:54 compute-0 sudo[147105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:54 compute-0 python3.9[147107]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361513.2916567-775-267821413974582/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:54 compute-0 sudo[147105]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:54 compute-0 sudo[147257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzmwepusbdtgpwddflobipypoucieujt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361514.4307501-775-78964814626540/AnsiballZ_stat.py'
Dec 10 10:11:54 compute-0 sudo[147257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:55 compute-0 python3.9[147259]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:55 compute-0 sudo[147257]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:55 compute-0 sudo[147380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufsbnwxaidzfnwwrimabcvktnydaxueo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361514.4307501-775-78964814626540/AnsiballZ_copy.py'
Dec 10 10:11:55 compute-0 sudo[147380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:55 compute-0 python3.9[147382]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361514.4307501-775-78964814626540/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:55 compute-0 sudo[147380]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:56 compute-0 sudo[147532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoefsjafhrxibtmarqvmqlegnvfkjgqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361515.7895591-775-177872554008188/AnsiballZ_stat.py'
Dec 10 10:11:56 compute-0 sudo[147532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:56 compute-0 python3.9[147534]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:56 compute-0 sudo[147532]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:56 compute-0 sudo[147655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jesadptwdbfvwlzerlmvjdjnhdaqokqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361515.7895591-775-177872554008188/AnsiballZ_copy.py'
Dec 10 10:11:56 compute-0 sudo[147655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:56 compute-0 python3.9[147657]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361515.7895591-775-177872554008188/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:56 compute-0 sudo[147655]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:57 compute-0 sudo[147807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ergjollaluucblehqeffbznwwdbncsui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361516.9216013-775-151121807926521/AnsiballZ_stat.py'
Dec 10 10:11:57 compute-0 sudo[147807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:57 compute-0 python3.9[147809]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:57 compute-0 sudo[147807]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:57 compute-0 sudo[147930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glteynocnsdyoplyefrjzmmimlvoxrja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361516.9216013-775-151121807926521/AnsiballZ_copy.py'
Dec 10 10:11:57 compute-0 sudo[147930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:58 compute-0 python3.9[147932]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361516.9216013-775-151121807926521/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:58 compute-0 sudo[147930]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:58 compute-0 sudo[148082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeszqciyrtnchjgthsesukmusmgckfgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361518.159756-775-69646393504601/AnsiballZ_stat.py'
Dec 10 10:11:58 compute-0 sudo[148082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:58 compute-0 python3.9[148084]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:58 compute-0 sudo[148082]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:59 compute-0 sudo[148205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipjexeigrulfipovjhljlygzsrwfgosl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361518.159756-775-69646393504601/AnsiballZ_copy.py'
Dec 10 10:11:59 compute-0 sudo[148205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:59 compute-0 python3.9[148207]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361518.159756-775-69646393504601/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:11:59 compute-0 sudo[148205]: pam_unix(sudo:session): session closed for user root
Dec 10 10:11:59 compute-0 sudo[148357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isbtyhfimvdvmxmgshqxrtipbvniysuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361519.4064887-775-54581763710078/AnsiballZ_stat.py'
Dec 10 10:11:59 compute-0 sudo[148357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:11:59 compute-0 python3.9[148359]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:11:59 compute-0 sudo[148357]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:00 compute-0 sudo[148480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxxpaduxeykyjaoausclyiwefzzhbmza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361519.4064887-775-54581763710078/AnsiballZ_copy.py'
Dec 10 10:12:00 compute-0 sudo[148480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:00 compute-0 python3.9[148482]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361519.4064887-775-54581763710078/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:00 compute-0 sudo[148480]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:00 compute-0 sudo[148632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkpyqpajuwpharuijxrdjvfbncppujxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361520.5998456-775-52287571025923/AnsiballZ_stat.py'
Dec 10 10:12:00 compute-0 sudo[148632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:01 compute-0 python3.9[148634]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:01 compute-0 sudo[148632]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:01 compute-0 sudo[148755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yueiumhjiebpjmborcyaggjuvejsobtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361520.5998456-775-52287571025923/AnsiballZ_copy.py'
Dec 10 10:12:01 compute-0 sudo[148755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:01 compute-0 python3.9[148757]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361520.5998456-775-52287571025923/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:01 compute-0 sudo[148755]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:02 compute-0 sudo[148907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjrimfcfymdxcxkirayizehawlsdfucg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361521.8550122-775-240150721573148/AnsiballZ_stat.py'
Dec 10 10:12:02 compute-0 sudo[148907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:02 compute-0 python3.9[148909]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:03 compute-0 sudo[148907]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:03 compute-0 sudo[149030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etntgakygzzgvvhiqvklzpsaudybobxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361521.8550122-775-240150721573148/AnsiballZ_copy.py'
Dec 10 10:12:03 compute-0 sudo[149030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:03 compute-0 python3.9[149032]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361521.8550122-775-240150721573148/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:03 compute-0 sudo[149030]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:04 compute-0 sudo[149182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjcwchkdsyzbrfqddsncxzvjfvwiugtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361523.8526773-775-225682237491431/AnsiballZ_stat.py'
Dec 10 10:12:04 compute-0 sudo[149182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:04 compute-0 python3.9[149184]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:04 compute-0 sudo[149182]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:04 compute-0 sudo[149305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydfeqxrtizjipjhawyvbmrmxiklvdqxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361523.8526773-775-225682237491431/AnsiballZ_copy.py'
Dec 10 10:12:04 compute-0 sudo[149305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:04 compute-0 podman[149307]: 2025-12-10 10:12:04.755639345 +0000 UTC m=+0.063954967 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 10 10:12:04 compute-0 python3.9[149308]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361523.8526773-775-225682237491431/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:04 compute-0 sudo[149305]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:05 compute-0 sudo[149476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyldbtklhunsrhejtqotvgrqovxauibw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361525.0462403-775-53185166640328/AnsiballZ_stat.py'
Dec 10 10:12:05 compute-0 sudo[149476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:05 compute-0 python3.9[149478]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:05 compute-0 sudo[149476]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:05 compute-0 sudo[149599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zumtcdggvxcifrndjlrxnmrmdykozvof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361525.0462403-775-53185166640328/AnsiballZ_copy.py'
Dec 10 10:12:05 compute-0 sudo[149599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:06 compute-0 python3.9[149601]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361525.0462403-775-53185166640328/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:06 compute-0 sudo[149599]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:06 compute-0 sudo[149751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xabcuixylybubwjuriipsclzkvegcmes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361526.3484704-775-13296294009153/AnsiballZ_stat.py'
Dec 10 10:12:06 compute-0 sudo[149751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:06 compute-0 python3.9[149753]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:06 compute-0 sudo[149751]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:07 compute-0 sudo[149874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bevjptouwfgdiwzzbkqzyhtglqslcdyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361526.3484704-775-13296294009153/AnsiballZ_copy.py'
Dec 10 10:12:07 compute-0 sudo[149874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:07 compute-0 python3.9[149876]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361526.3484704-775-13296294009153/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:07 compute-0 sudo[149874]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:07 compute-0 sudo[150026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcyoskvrmtiuaevurdwlcuvoxfvfgbtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361527.4618793-775-193983089392558/AnsiballZ_stat.py'
Dec 10 10:12:07 compute-0 sudo[150026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:07 compute-0 python3.9[150028]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:07 compute-0 sudo[150026]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:08 compute-0 sudo[150149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zswujgfpsixrdoqacvcsghfytqceiesr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361527.4618793-775-193983089392558/AnsiballZ_copy.py'
Dec 10 10:12:08 compute-0 sudo[150149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:08 compute-0 python3.9[150151]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361527.4618793-775-193983089392558/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:08 compute-0 sudo[150149]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:08 compute-0 sudo[150301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfurrhqhqgxjeoxlhjxwwsgkvvgmxgbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361528.6618335-775-115302078300950/AnsiballZ_stat.py'
Dec 10 10:12:08 compute-0 sudo[150301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:09 compute-0 python3.9[150303]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:09 compute-0 sudo[150301]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:09 compute-0 sudo[150424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksefbgxntdfviycdoesdanumshzcyhbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361528.6618335-775-115302078300950/AnsiballZ_copy.py'
Dec 10 10:12:09 compute-0 sudo[150424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:09 compute-0 python3.9[150426]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361528.6618335-775-115302078300950/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:09 compute-0 sudo[150424]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:10 compute-0 podman[150474]: 2025-12-10 10:12:10.056806433 +0000 UTC m=+0.084924991 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 10 10:12:10 compute-0 sudo[150602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wughdfujhpfdalpqtunyndovmxlwnhby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361529.86559-775-143185458636355/AnsiballZ_stat.py'
Dec 10 10:12:10 compute-0 sudo[150602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:10 compute-0 python3.9[150604]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:10 compute-0 sudo[150602]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:10 compute-0 sudo[150725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyljwybknvjxtesvnsjeytwowuitdqho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361529.86559-775-143185458636355/AnsiballZ_copy.py'
Dec 10 10:12:10 compute-0 sudo[150725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:11 compute-0 python3.9[150727]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361529.86559-775-143185458636355/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:11 compute-0 sudo[150725]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:11 compute-0 python3.9[150877]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:12:12 compute-0 sudo[151030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sovrdmazdmhhtxkeenptvjjytlelffkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361531.901801-981-172559124025208/AnsiballZ_seboolean.py'
Dec 10 10:12:12 compute-0 sudo[151030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:12 compute-0 python3.9[151032]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 10 10:12:13 compute-0 sudo[151030]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:14 compute-0 sudo[151186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omdndwnjyuittixtzbqjlqlwteuugicd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361533.895961-989-195803032183384/AnsiballZ_copy.py'
Dec 10 10:12:14 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 10 10:12:14 compute-0 sudo[151186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:14 compute-0 python3.9[151188]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:14 compute-0 sudo[151186]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:14 compute-0 sudo[151338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cutmniidzgywzkyryobuglbnbjykdknr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361534.5354025-989-83923093410271/AnsiballZ_copy.py'
Dec 10 10:12:14 compute-0 sudo[151338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:15 compute-0 python3.9[151340]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:15 compute-0 sudo[151338]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:15 compute-0 sudo[151490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcawboalfkumhogbwudlqpqtemsandit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361535.2044153-989-257005319956887/AnsiballZ_copy.py'
Dec 10 10:12:15 compute-0 sudo[151490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:15 compute-0 python3.9[151492]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:15 compute-0 sudo[151490]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:16 compute-0 sudo[151642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irtjxfvyahjnqnxjjlvkywvkevqozlia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361535.8435237-989-39446620078514/AnsiballZ_copy.py'
Dec 10 10:12:16 compute-0 sudo[151642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:16 compute-0 python3.9[151644]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:16 compute-0 sudo[151642]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:16 compute-0 sudo[151794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odfxjjhppyavuhdsskcfazaexnswwdox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361536.4771392-989-145738588471071/AnsiballZ_copy.py'
Dec 10 10:12:16 compute-0 sudo[151794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:17 compute-0 python3.9[151796]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:17 compute-0 sudo[151794]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:17 compute-0 sudo[151946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hutfrxajdnsqbjnjkylhmacvntvwlrvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361537.215152-1025-119658597918267/AnsiballZ_copy.py'
Dec 10 10:12:17 compute-0 sudo[151946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:17 compute-0 python3.9[151948]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:17 compute-0 sudo[151946]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:18 compute-0 sudo[152098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuspycjxehppyhblcxnkkjevuugyvjmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361537.8461614-1025-179153323153102/AnsiballZ_copy.py'
Dec 10 10:12:18 compute-0 sudo[152098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:18 compute-0 python3.9[152100]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:18 compute-0 sudo[152098]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:18 compute-0 sudo[152250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnnlqglmyjciolfqckyjkoqdqnqhporw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361538.4962633-1025-230248393996310/AnsiballZ_copy.py'
Dec 10 10:12:18 compute-0 sudo[152250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:19 compute-0 python3.9[152252]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:19 compute-0 sudo[152250]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:19 compute-0 sudo[152402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edycfhswxlshlbsaehwtcrjmxbpsuypm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361539.1951833-1025-5833474496971/AnsiballZ_copy.py'
Dec 10 10:12:19 compute-0 sudo[152402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:19 compute-0 python3.9[152404]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:19 compute-0 sudo[152402]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:20 compute-0 sudo[152554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysyuqpjjwtmpydpfgkmsibqebmygtocb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361539.7968144-1025-133746867495592/AnsiballZ_copy.py'
Dec 10 10:12:20 compute-0 sudo[152554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:20 compute-0 python3.9[152556]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:20 compute-0 sudo[152554]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:20 compute-0 sudo[152706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myrafdefxvjbsoyuielpwpipobrmabgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361540.5126693-1061-72829311219235/AnsiballZ_systemd.py'
Dec 10 10:12:20 compute-0 sudo[152706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:21 compute-0 python3.9[152708]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:12:21 compute-0 systemd[1]: Reloading.
Dec 10 10:12:21 compute-0 systemd-rc-local-generator[152732]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:12:21 compute-0 systemd-sysv-generator[152738]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:12:21 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Dec 10 10:12:21 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Dec 10 10:12:21 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 10 10:12:21 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 10 10:12:21 compute-0 systemd[1]: Starting libvirt logging daemon...
Dec 10 10:12:21 compute-0 systemd[1]: Started libvirt logging daemon.
Dec 10 10:12:21 compute-0 sudo[152706]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:21 compute-0 sudo[152899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xepehvalnhkilmjtvuphotrnxudpshug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361541.6873183-1061-182024204318803/AnsiballZ_systemd.py'
Dec 10 10:12:21 compute-0 sudo[152899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:22 compute-0 python3.9[152901]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:12:22 compute-0 systemd[1]: Reloading.
Dec 10 10:12:22 compute-0 systemd-sysv-generator[152933]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:12:22 compute-0 systemd-rc-local-generator[152929]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:12:22 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 10 10:12:22 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 10 10:12:22 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 10 10:12:22 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 10 10:12:22 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 10 10:12:22 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 10 10:12:22 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 10 10:12:22 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 10 10:12:22 compute-0 sudo[152899]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:23 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 10 10:12:23 compute-0 sudo[153116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yschjfgpujvkivirthnhuuqcdkkmcjeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361542.8033628-1061-110175741818698/AnsiballZ_systemd.py'
Dec 10 10:12:23 compute-0 sudo[153116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:23 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 10 10:12:23 compute-0 python3.9[153118]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:12:23 compute-0 systemd[1]: Reloading.
Dec 10 10:12:23 compute-0 systemd-rc-local-generator[153140]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:12:23 compute-0 systemd-sysv-generator[153143]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:12:23 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 10 10:12:23 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 10 10:12:23 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 10 10:12:23 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 10 10:12:23 compute-0 systemd[1]: Starting libvirt proxy daemon...
Dec 10 10:12:23 compute-0 systemd[1]: Started libvirt proxy daemon.
Dec 10 10:12:23 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 10 10:12:23 compute-0 sudo[153116]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:23 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 10 10:12:24 compute-0 sudo[153336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-milqdaypywvpqdyphwygtxceejmohpkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361543.918572-1061-265110518178428/AnsiballZ_systemd.py'
Dec 10 10:12:24 compute-0 sudo[153336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:24 compute-0 python3.9[153338]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:12:24 compute-0 systemd[1]: Reloading.
Dec 10 10:12:24 compute-0 systemd-sysv-generator[153368]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:12:24 compute-0 systemd-rc-local-generator[153364]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:12:24 compute-0 setroubleshoot[153089]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 4676e585-3d88-459a-b049-3a03d937ecd7
Dec 10 10:12:24 compute-0 setroubleshoot[153089]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 10 10:12:24 compute-0 setroubleshoot[153089]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 4676e585-3d88-459a-b049-3a03d937ecd7
Dec 10 10:12:24 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Dec 10 10:12:24 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 10 10:12:24 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 10 10:12:24 compute-0 setroubleshoot[153089]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 10 10:12:24 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 10 10:12:24 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 10 10:12:24 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 10 10:12:24 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 10 10:12:24 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 10 10:12:24 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 10 10:12:24 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 10 10:12:24 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 10 10:12:24 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 10 10:12:24 compute-0 sudo[153336]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:25 compute-0 sudo[153552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnxqminojnxhxknjfpwhaythcirtwaly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361545.0126889-1061-253350191679039/AnsiballZ_systemd.py'
Dec 10 10:12:25 compute-0 sudo[153552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:25 compute-0 python3.9[153554]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:12:25 compute-0 systemd[1]: Reloading.
Dec 10 10:12:25 compute-0 systemd-rc-local-generator[153583]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:12:25 compute-0 systemd-sysv-generator[153586]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:12:25 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Dec 10 10:12:25 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Dec 10 10:12:25 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 10 10:12:25 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 10 10:12:25 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 10 10:12:25 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 10 10:12:25 compute-0 systemd[1]: Starting libvirt secret daemon...
Dec 10 10:12:26 compute-0 systemd[1]: Started libvirt secret daemon.
Dec 10 10:12:26 compute-0 sudo[153552]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:26 compute-0 sudo[153764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztvczljfylxbneifwgnabnfeothmotum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361546.332858-1098-12528464201164/AnsiballZ_file.py'
Dec 10 10:12:26 compute-0 sudo[153764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:26 compute-0 python3.9[153766]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:26 compute-0 sudo[153764]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:27 compute-0 sudo[153916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfaszdgqizmzsixzrmkaimfanqvznoyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361547.026079-1106-171956598141734/AnsiballZ_find.py'
Dec 10 10:12:27 compute-0 sudo[153916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:27 compute-0 python3.9[153918]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 10 10:12:27 compute-0 sudo[153916]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:28 compute-0 sudo[154068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iifajhophpvfdkcuscbyyuksdqimkjzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361547.8668337-1120-148300142072815/AnsiballZ_stat.py'
Dec 10 10:12:28 compute-0 sudo[154068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:28 compute-0 python3.9[154070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:28 compute-0 sudo[154068]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:28 compute-0 sudo[154191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fodwrjwgnepsgxejngbkyhehrczwgtpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361547.8668337-1120-148300142072815/AnsiballZ_copy.py'
Dec 10 10:12:28 compute-0 sudo[154191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:28 compute-0 python3.9[154193]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361547.8668337-1120-148300142072815/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:28 compute-0 sudo[154191]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:29 compute-0 sudo[154343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hflxfdrqvfdglhjwdmvxpncstrkjewjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361549.2013476-1136-73949366478239/AnsiballZ_file.py'
Dec 10 10:12:29 compute-0 sudo[154343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:29 compute-0 python3.9[154345]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:29 compute-0 sudo[154343]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:30 compute-0 sudo[154495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blzschkvnmfgbrmrjksxdzyrrfdciida ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361549.8737445-1144-196904103699749/AnsiballZ_stat.py'
Dec 10 10:12:30 compute-0 sudo[154495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:30 compute-0 python3.9[154497]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:30 compute-0 sudo[154495]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:30 compute-0 sudo[154573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttnzqvtxyggeahdlzehcwhnoownlkwpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361549.8737445-1144-196904103699749/AnsiballZ_file.py'
Dec 10 10:12:30 compute-0 sudo[154573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:30 compute-0 python3.9[154575]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:30 compute-0 sudo[154573]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:31 compute-0 sudo[154725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxgupcfyrtzrtouoxtlyczqzfbfomeaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361550.9501204-1156-30711629120218/AnsiballZ_stat.py'
Dec 10 10:12:31 compute-0 sudo[154725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:31 compute-0 python3.9[154727]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:12:31.452 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:12:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:12:31.454 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:12:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:12:31.454 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:12:31 compute-0 sudo[154725]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:31 compute-0 sudo[154803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkbtvsnrjmkdvqukpdevuamilvlugytu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361550.9501204-1156-30711629120218/AnsiballZ_file.py'
Dec 10 10:12:31 compute-0 sudo[154803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:31 compute-0 python3.9[154805]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.6t1kkqb9 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:31 compute-0 sudo[154803]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:32 compute-0 sudo[154955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvhvdrctpwjipahxmsvpupwwhotirodr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361552.1001756-1168-173136383566739/AnsiballZ_stat.py'
Dec 10 10:12:32 compute-0 sudo[154955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:32 compute-0 python3.9[154957]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:32 compute-0 sudo[154955]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:32 compute-0 sudo[155033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrbzotdcdkguujijwxbsmosnevzehzfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361552.1001756-1168-173136383566739/AnsiballZ_file.py'
Dec 10 10:12:32 compute-0 sudo[155033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:33 compute-0 python3.9[155035]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:33 compute-0 sudo[155033]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:33 compute-0 sudo[155185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivnouswintytulywtkrygushfkfwudaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361553.2959976-1181-96147310660730/AnsiballZ_command.py'
Dec 10 10:12:33 compute-0 sudo[155185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:33 compute-0 python3.9[155187]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:12:33 compute-0 sudo[155185]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:34 compute-0 sudo[155338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czaoceedwcaxfltttvitmwfjxdmhrlee ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361554.1689487-1189-274084476851131/AnsiballZ_edpm_nftables_from_files.py'
Dec 10 10:12:34 compute-0 sudo[155338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:34 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 10 10:12:34 compute-0 python3[155340]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 10 10:12:34 compute-0 sudo[155338]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:34 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 10 10:12:34 compute-0 podman[155341]: 2025-12-10 10:12:34.903779204 +0000 UTC m=+0.055549467 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 10 10:12:35 compute-0 sudo[155510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkkfwrvppjcxohupytytqtkaxpfxsbkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361555.0034053-1197-234668092140782/AnsiballZ_stat.py'
Dec 10 10:12:35 compute-0 sudo[155510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:35 compute-0 python3.9[155512]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:35 compute-0 sudo[155510]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:35 compute-0 sudo[155588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfojtxskfjohctfdtziabcsfclnkvqtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361555.0034053-1197-234668092140782/AnsiballZ_file.py'
Dec 10 10:12:35 compute-0 sudo[155588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:35 compute-0 python3.9[155590]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:35 compute-0 sudo[155588]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:36 compute-0 sudo[155740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noygfyrkiqunglyoyozcfvdvaypxdyle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361556.1210492-1209-191513120726275/AnsiballZ_stat.py'
Dec 10 10:12:36 compute-0 sudo[155740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:36 compute-0 python3.9[155742]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:36 compute-0 sudo[155740]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:36 compute-0 sudo[155818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trcymwjsjrtgowghxyugvisbgmlcrfek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361556.1210492-1209-191513120726275/AnsiballZ_file.py'
Dec 10 10:12:36 compute-0 sudo[155818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:37 compute-0 python3.9[155820]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:37 compute-0 sudo[155818]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:37 compute-0 sudo[155970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfqcbouudgghegtfgwfncromjexcqdvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361557.2121625-1221-49612177815164/AnsiballZ_stat.py'
Dec 10 10:12:37 compute-0 sudo[155970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:37 compute-0 python3.9[155972]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:37 compute-0 sudo[155970]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:37 compute-0 sudo[156048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxbbyioqnxntdgumrwncoomjpjpzfsoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361557.2121625-1221-49612177815164/AnsiballZ_file.py'
Dec 10 10:12:37 compute-0 sudo[156048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:38 compute-0 python3.9[156050]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:38 compute-0 sudo[156048]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:38 compute-0 sudo[156200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyjgwexxccwgbqopskxkxymljteibeyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361558.3119915-1233-53639328841841/AnsiballZ_stat.py'
Dec 10 10:12:38 compute-0 sudo[156200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:38 compute-0 python3.9[156202]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:38 compute-0 sudo[156200]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:39 compute-0 sudo[156278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajkxjfwxcgekrokwpkzwagddmfkijcxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361558.3119915-1233-53639328841841/AnsiballZ_file.py'
Dec 10 10:12:39 compute-0 sudo[156278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:39 compute-0 python3.9[156280]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:39 compute-0 sudo[156278]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:39 compute-0 sudo[156430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vukrpyyjemccvvupitpgnjloeaxdjlay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361559.4758666-1245-116614848039974/AnsiballZ_stat.py'
Dec 10 10:12:39 compute-0 sudo[156430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:40 compute-0 python3.9[156432]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:40 compute-0 sudo[156430]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:40 compute-0 sudo[156570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhcijmhtkergszzgglkasuttoszirtgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361559.4758666-1245-116614848039974/AnsiballZ_copy.py'
Dec 10 10:12:40 compute-0 sudo[156570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:40 compute-0 podman[156529]: 2025-12-10 10:12:40.447881709 +0000 UTC m=+0.092639870 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:12:40 compute-0 python3.9[156575]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361559.4758666-1245-116614848039974/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:40 compute-0 sudo[156570]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:41 compute-0 sudo[156731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpzlqwklncwkcdjzkeinhjvqrzwrbpyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361560.879432-1260-251480951372758/AnsiballZ_file.py'
Dec 10 10:12:41 compute-0 sudo[156731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:41 compute-0 python3.9[156733]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:41 compute-0 sudo[156731]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:41 compute-0 sudo[156883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quxxyworfowgdkhwevsxpnjedmjnijtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361561.5788116-1268-23872159367133/AnsiballZ_command.py'
Dec 10 10:12:41 compute-0 sudo[156883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:42 compute-0 python3.9[156885]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:12:42 compute-0 sudo[156883]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:42 compute-0 sudo[157038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwrsyszpbuybceatkblblxdimgenoztx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361562.2920675-1276-270051304122191/AnsiballZ_blockinfile.py'
Dec 10 10:12:42 compute-0 sudo[157038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:42 compute-0 python3.9[157040]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:42 compute-0 sudo[157038]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:43 compute-0 sudo[157190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfxuipgctiuoooiimbzvigxsjszwoycg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361563.1557689-1285-36990220660725/AnsiballZ_command.py'
Dec 10 10:12:43 compute-0 sudo[157190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:43 compute-0 python3.9[157192]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:12:43 compute-0 sudo[157190]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:44 compute-0 sudo[157343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohoqxvcwacadblqtgmsjjnrviqriyzib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361563.8206527-1293-21904114093891/AnsiballZ_stat.py'
Dec 10 10:12:44 compute-0 sudo[157343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:44 compute-0 python3.9[157345]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:12:44 compute-0 sudo[157343]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:44 compute-0 sudo[157497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdeahpkutmhmwqvvikbjsztosgegjxox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361564.4687483-1301-60272729113047/AnsiballZ_command.py'
Dec 10 10:12:44 compute-0 sudo[157497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:44 compute-0 python3.9[157499]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:12:45 compute-0 sudo[157497]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:45 compute-0 sudo[157652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zueaznulmdujybmgdqzcsdzmzmetwpmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361565.1664708-1309-238520263524825/AnsiballZ_file.py'
Dec 10 10:12:45 compute-0 sudo[157652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:45 compute-0 python3.9[157654]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:45 compute-0 sudo[157652]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:46 compute-0 sudo[157804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhtnfgjthrizpsvxrbugyenophpqjuvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361565.9380999-1317-18057772148190/AnsiballZ_stat.py'
Dec 10 10:12:46 compute-0 sudo[157804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:46 compute-0 python3.9[157806]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:46 compute-0 sudo[157804]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:46 compute-0 sudo[157927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rupjjowqltjnljiahqettxoeevxbajkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361565.9380999-1317-18057772148190/AnsiballZ_copy.py'
Dec 10 10:12:46 compute-0 sudo[157927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:47 compute-0 python3.9[157929]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361565.9380999-1317-18057772148190/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:47 compute-0 sudo[157927]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:47 compute-0 sudo[158079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfnimzgfhiidrxnkotkufihtgfranbma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361567.3508258-1332-225886443270731/AnsiballZ_stat.py'
Dec 10 10:12:47 compute-0 sudo[158079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:47 compute-0 python3.9[158081]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:47 compute-0 sudo[158079]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:48 compute-0 sudo[158202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mycjcozkzhmunybpcltctpawzlzbuzah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361567.3508258-1332-225886443270731/AnsiballZ_copy.py'
Dec 10 10:12:48 compute-0 sudo[158202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:48 compute-0 python3.9[158204]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361567.3508258-1332-225886443270731/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:48 compute-0 sudo[158202]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:48 compute-0 sudo[158354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kovwjhoewlieejntdhwpjsdnzvucihdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361568.554209-1347-127213684510585/AnsiballZ_stat.py'
Dec 10 10:12:48 compute-0 sudo[158354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:49 compute-0 python3.9[158356]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:12:49 compute-0 sudo[158354]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:49 compute-0 sudo[158477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzahklaishcojuxixxvevmjzlnmpfnsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361568.554209-1347-127213684510585/AnsiballZ_copy.py'
Dec 10 10:12:49 compute-0 sudo[158477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:49 compute-0 python3.9[158479]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361568.554209-1347-127213684510585/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:12:49 compute-0 sudo[158477]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:50 compute-0 sudo[158629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dajjwokcwwygkfzuvdvhqalfuvnthfyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361569.7907307-1362-112352298162189/AnsiballZ_systemd.py'
Dec 10 10:12:50 compute-0 sudo[158629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:50 compute-0 python3.9[158631]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:12:50 compute-0 systemd[1]: Reloading.
Dec 10 10:12:50 compute-0 systemd-rc-local-generator[158658]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:12:50 compute-0 systemd-sysv-generator[158661]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:12:50 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Dec 10 10:12:50 compute-0 sudo[158629]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:51 compute-0 sudo[158819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuiqdheatzbvyvwjsqkhlpbjoxgjslfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361570.90337-1370-178334937768943/AnsiballZ_systemd.py'
Dec 10 10:12:51 compute-0 sudo[158819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:12:51 compute-0 python3.9[158821]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 10 10:12:51 compute-0 systemd[1]: Reloading.
Dec 10 10:12:51 compute-0 systemd-rc-local-generator[158848]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:12:51 compute-0 systemd-sysv-generator[158851]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:12:51 compute-0 systemd[1]: Reloading.
Dec 10 10:12:51 compute-0 systemd-sysv-generator[158886]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:12:51 compute-0 systemd-rc-local-generator[158883]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:12:52 compute-0 sudo[158819]: pam_unix(sudo:session): session closed for user root
Dec 10 10:12:52 compute-0 sshd-session[104450]: Connection closed by 192.168.122.30 port 55526
Dec 10 10:12:52 compute-0 sshd-session[104447]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:12:52 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Dec 10 10:12:52 compute-0 systemd[1]: session-23.scope: Consumed 3min 27.114s CPU time.
Dec 10 10:12:52 compute-0 systemd-logind[787]: Session 23 logged out. Waiting for processes to exit.
Dec 10 10:12:52 compute-0 systemd-logind[787]: Removed session 23.
Dec 10 10:12:57 compute-0 sshd-session[158917]: Accepted publickey for zuul from 192.168.122.30 port 51168 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:12:58 compute-0 systemd-logind[787]: New session 24 of user zuul.
Dec 10 10:12:58 compute-0 systemd[1]: Started Session 24 of User zuul.
Dec 10 10:12:58 compute-0 sshd-session[158917]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:12:59 compute-0 python3.9[159070]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:13:00 compute-0 python3.9[159224]: ansible-ansible.builtin.service_facts Invoked
Dec 10 10:13:00 compute-0 network[159241]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 10 10:13:00 compute-0 network[159242]: 'network-scripts' will be removed from distribution in near future.
Dec 10 10:13:00 compute-0 network[159243]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 10 10:13:05 compute-0 podman[159431]: 2025-12-10 10:13:05.016603012 +0000 UTC m=+0.060355805 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:13:05 compute-0 sudo[159531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofynnngtgwieicwsbwonbqvnsptczwyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361584.89445-47-38303947344205/AnsiballZ_setup.py'
Dec 10 10:13:05 compute-0 sudo[159531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:05 compute-0 python3.9[159533]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 10 10:13:05 compute-0 sudo[159531]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:06 compute-0 sudo[159615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyropljevrilqbeavcvkmsaiqiueptba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361584.89445-47-38303947344205/AnsiballZ_dnf.py'
Dec 10 10:13:06 compute-0 sudo[159615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:06 compute-0 python3.9[159617]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:13:11 compute-0 podman[159619]: 2025-12-10 10:13:11.071077038 +0000 UTC m=+0.105781241 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 10 10:13:11 compute-0 sudo[159615]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:12 compute-0 sudo[159794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asdggiguisxxadqyobnakkjrltshrjih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361591.9293342-59-53422767054862/AnsiballZ_stat.py'
Dec 10 10:13:12 compute-0 sudo[159794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:12 compute-0 python3.9[159796]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:13:12 compute-0 sudo[159794]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:13 compute-0 sudo[159946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alczmdtrtxzdslgsjpdjaypgwxjoizzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361592.8470535-69-133227143534838/AnsiballZ_command.py'
Dec 10 10:13:13 compute-0 sudo[159946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:13 compute-0 python3.9[159948]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:13:13 compute-0 sudo[159946]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:14 compute-0 sudo[160099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxdsmupxzklxskycqdkweemybosjibhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361593.8024201-79-182489662988883/AnsiballZ_stat.py'
Dec 10 10:13:14 compute-0 sudo[160099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:14 compute-0 python3.9[160101]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:13:14 compute-0 sudo[160099]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:14 compute-0 sudo[160251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtvgafdycdgohkdbhkilougwjtclokdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361594.4492338-87-72443125271790/AnsiballZ_command.py'
Dec 10 10:13:14 compute-0 sudo[160251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:14 compute-0 python3.9[160253]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:13:14 compute-0 sudo[160251]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:15 compute-0 sudo[160404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpkuokyvknxayfjcxrnwctdcqzvfnktk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361595.1413274-95-97503846999891/AnsiballZ_stat.py'
Dec 10 10:13:15 compute-0 sudo[160404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:15 compute-0 python3.9[160406]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:13:15 compute-0 sudo[160404]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:16 compute-0 sudo[160527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hknveezhqiozyrggqiyorlkjllhxrixm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361595.1413274-95-97503846999891/AnsiballZ_copy.py'
Dec 10 10:13:16 compute-0 sudo[160527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:16 compute-0 python3.9[160529]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361595.1413274-95-97503846999891/.source.iscsi _original_basename=.aztmwta1 follow=False checksum=7050384c491105bc5089220bb474d27cb44a4260 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:16 compute-0 sudo[160527]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:17 compute-0 sudo[160679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqdulqfgzmetocxxtoegkcdqehhwvevp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361596.6010458-110-18879593557750/AnsiballZ_file.py'
Dec 10 10:13:17 compute-0 sudo[160679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:17 compute-0 python3.9[160681]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:17 compute-0 sudo[160679]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:17 compute-0 sudo[160831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvsndwigmeuguritrqvmlxdyuorsyuks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361597.4276955-118-138527317770577/AnsiballZ_lineinfile.py'
Dec 10 10:13:17 compute-0 sudo[160831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:18 compute-0 python3.9[160833]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:18 compute-0 sudo[160831]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:18 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 10 10:13:18 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 10 10:13:18 compute-0 sudo[160984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrkvckxzppzrzpbahvobntnybnbemuye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361598.3328235-127-41373707762176/AnsiballZ_systemd_service.py'
Dec 10 10:13:18 compute-0 sudo[160984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:19 compute-0 python3.9[160986]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:13:20 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 10 10:13:20 compute-0 sudo[160984]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:20 compute-0 sudo[161140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njxnddmblaxwstexubucwxgpcuaztvco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361600.5001166-135-207871849588775/AnsiballZ_systemd_service.py'
Dec 10 10:13:20 compute-0 sudo[161140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:21 compute-0 python3.9[161142]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:13:21 compute-0 systemd[1]: Reloading.
Dec 10 10:13:21 compute-0 systemd-rc-local-generator[161167]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:13:21 compute-0 systemd-sysv-generator[161173]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:13:21 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 10 10:13:21 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 10 10:13:21 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Dec 10 10:13:21 compute-0 systemd[1]: Started Open-iSCSI.
Dec 10 10:13:21 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 10 10:13:21 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 10 10:13:21 compute-0 sudo[161140]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:22 compute-0 sudo[161340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crijyxlievkarvjvgkkwtyxqebepkwfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361601.8493435-146-1569877654468/AnsiballZ_service_facts.py'
Dec 10 10:13:22 compute-0 sudo[161340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:22 compute-0 python3.9[161342]: ansible-ansible.builtin.service_facts Invoked
Dec 10 10:13:22 compute-0 network[161359]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 10 10:13:22 compute-0 network[161360]: 'network-scripts' will be removed from distribution in near future.
Dec 10 10:13:22 compute-0 network[161361]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 10 10:13:26 compute-0 sudo[161340]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:27 compute-0 sudo[161630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekkrdjjkeqcdazxonblmbfhpdufixbxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361607.209308-156-187127703633988/AnsiballZ_file.py'
Dec 10 10:13:27 compute-0 sudo[161630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:27 compute-0 python3.9[161632]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 10 10:13:27 compute-0 sudo[161630]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:28 compute-0 sudo[161782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcipmdbdzdkddmnwfqnexaoioeqktxvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361607.9493268-164-93888996106152/AnsiballZ_modprobe.py'
Dec 10 10:13:28 compute-0 sudo[161782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:28 compute-0 python3.9[161784]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 10 10:13:28 compute-0 sudo[161782]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:29 compute-0 sudo[161938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sldwgveqtosxuaisvofgmxoxdgfsqaoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361608.7749014-172-165890844998396/AnsiballZ_stat.py'
Dec 10 10:13:29 compute-0 sudo[161938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:29 compute-0 python3.9[161940]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:13:29 compute-0 sudo[161938]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:29 compute-0 sudo[162061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjcxhcjzimivivigjmxdaogvnrzykmrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361608.7749014-172-165890844998396/AnsiballZ_copy.py'
Dec 10 10:13:29 compute-0 sudo[162061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:29 compute-0 python3.9[162063]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361608.7749014-172-165890844998396/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:29 compute-0 sudo[162061]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:30 compute-0 sudo[162213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipxggesbqegffcaqbecbvjqqpljxksxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361610.0755367-188-34160901992068/AnsiballZ_lineinfile.py'
Dec 10 10:13:30 compute-0 sudo[162213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:30 compute-0 python3.9[162215]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:30 compute-0 sudo[162213]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:13:31.453 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:13:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:13:31.454 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:13:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:13:31.454 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:13:31 compute-0 sudo[162366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekbazroaqhhqxistqbbedoglwhmysllh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361610.7944314-196-159119151878616/AnsiballZ_systemd.py'
Dec 10 10:13:31 compute-0 sudo[162366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:31 compute-0 python3.9[162368]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:13:31 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 10 10:13:31 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 10 10:13:31 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 10 10:13:31 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 10 10:13:31 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 10 10:13:31 compute-0 sudo[162366]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:32 compute-0 sudo[162522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiqgrsmgwkvgsujpbgryyxtwvwxjmqbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361612.0220554-204-137587024882153/AnsiballZ_file.py'
Dec 10 10:13:32 compute-0 sudo[162522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:32 compute-0 python3.9[162524]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:13:32 compute-0 sudo[162522]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:33 compute-0 sudo[162674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndnuswjrxzvgmsqvhbufpaoqrbjsutjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361612.8118653-213-180627700653278/AnsiballZ_stat.py'
Dec 10 10:13:33 compute-0 sudo[162674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:33 compute-0 python3.9[162676]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:13:33 compute-0 sudo[162674]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:33 compute-0 sudo[162828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xplkmmevoncjilwltakqhrqljlzsapll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361613.5534444-222-96262671840444/AnsiballZ_stat.py'
Dec 10 10:13:33 compute-0 sudo[162828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:34 compute-0 python3.9[162830]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:13:34 compute-0 sudo[162828]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:34 compute-0 sudo[162980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwzrkgwcwcxiaqptpdkgqqniziomsfsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361614.282589-230-58411449541120/AnsiballZ_stat.py'
Dec 10 10:13:34 compute-0 sudo[162980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:34 compute-0 python3.9[162982]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:13:34 compute-0 sudo[162980]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:35 compute-0 sudo[163114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxkxxfghcbaovtdoidavqkdjaurysuiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361614.282589-230-58411449541120/AnsiballZ_copy.py'
Dec 10 10:13:35 compute-0 sudo[163114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:35 compute-0 podman[163077]: 2025-12-10 10:13:35.251349172 +0000 UTC m=+0.069103346 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Dec 10 10:13:35 compute-0 python3.9[163118]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361614.282589-230-58411449541120/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:35 compute-0 sudo[163114]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:35 compute-0 sudo[163272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msumgjxlppevrcypedcfepskiksibldc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361615.657662-245-181262708192129/AnsiballZ_command.py'
Dec 10 10:13:35 compute-0 sudo[163272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:36 compute-0 python3.9[163274]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:13:36 compute-0 sudo[163272]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:36 compute-0 sudo[163425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkjcgmjvegaheifgkdmojmfgwohprbtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361616.2704139-253-219141996310695/AnsiballZ_lineinfile.py'
Dec 10 10:13:36 compute-0 sudo[163425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:36 compute-0 python3.9[163427]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:36 compute-0 sudo[163425]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:37 compute-0 sudo[163577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usepdvxttqvvunulkhapoufgwejpnzmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361616.9038625-261-105391010612449/AnsiballZ_replace.py'
Dec 10 10:13:37 compute-0 sudo[163577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:37 compute-0 python3.9[163579]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:37 compute-0 sudo[163577]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:38 compute-0 sudo[163729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoxfjtlifqfnnedkyqghwxulmfjebvej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361617.7323298-269-119608830420749/AnsiballZ_replace.py'
Dec 10 10:13:38 compute-0 sudo[163729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:38 compute-0 python3.9[163731]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:38 compute-0 sudo[163729]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:38 compute-0 sudo[163881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncfvgxjljwvmxeknxmzihyijbpjchqak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361618.433868-278-126684870247822/AnsiballZ_lineinfile.py'
Dec 10 10:13:38 compute-0 sudo[163881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:38 compute-0 python3.9[163883]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:38 compute-0 sudo[163881]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:39 compute-0 sudo[164035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpowzqqwueeimzdcilsivhzqkniuitxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361619.1307023-278-84411313351945/AnsiballZ_lineinfile.py'
Dec 10 10:13:39 compute-0 sudo[164035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:39 compute-0 sshd-session[163884]: Received disconnect from 193.46.255.244 port 49028:11:  [preauth]
Dec 10 10:13:39 compute-0 sshd-session[163884]: Disconnected from authenticating user root 193.46.255.244 port 49028 [preauth]
Dec 10 10:13:39 compute-0 python3.9[164037]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:39 compute-0 sudo[164035]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:40 compute-0 sudo[164187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pslcrijkgruqggzkyorxwzvcaqdjvavu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361619.8743267-278-36363970100308/AnsiballZ_lineinfile.py'
Dec 10 10:13:40 compute-0 sudo[164187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:40 compute-0 python3.9[164189]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:40 compute-0 sudo[164187]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:40 compute-0 sudo[164339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isqpfwfnqkmpjlsjtgphiscqawutqyxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361620.5492866-278-197448533057718/AnsiballZ_lineinfile.py'
Dec 10 10:13:40 compute-0 sudo[164339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:41 compute-0 python3.9[164341]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:41 compute-0 sudo[164339]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:41 compute-0 sudo[164507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvgvtctuizvvgtiuoiemqbtpjocupcqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361621.231542-307-108418360692681/AnsiballZ_stat.py'
Dec 10 10:13:41 compute-0 sudo[164507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:41 compute-0 podman[164465]: 2025-12-10 10:13:41.585088369 +0000 UTC m=+0.092018616 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:13:41 compute-0 python3.9[164513]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:13:41 compute-0 sudo[164507]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:42 compute-0 sudo[164673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfwgcmnzpynuxcshickwuvvqlzrznyhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361622.320868-315-274754950828266/AnsiballZ_file.py'
Dec 10 10:13:42 compute-0 sudo[164673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:42 compute-0 python3.9[164675]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:42 compute-0 sudo[164673]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:43 compute-0 sudo[164825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrcbpoixtdyjsxtxvpqtuqsnmzsslfkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361623.007541-324-231216702525621/AnsiballZ_file.py'
Dec 10 10:13:43 compute-0 sudo[164825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:43 compute-0 python3.9[164827]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:13:43 compute-0 sudo[164825]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:43 compute-0 sudo[164977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzjflsqnfvizhogsybmntnlxpunepvsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361623.6250412-332-183994658917142/AnsiballZ_stat.py'
Dec 10 10:13:43 compute-0 sudo[164977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:44 compute-0 python3.9[164979]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:13:44 compute-0 sudo[164977]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:44 compute-0 sudo[165055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oybirbfijxnwqzzqjrdwzzjbazxtbfzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361623.6250412-332-183994658917142/AnsiballZ_file.py'
Dec 10 10:13:44 compute-0 sudo[165055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:44 compute-0 python3.9[165057]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:13:44 compute-0 sudo[165055]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:45 compute-0 sudo[165207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xypyvbwsbumaxjhqbysyiowpuzchniyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361624.6784039-332-171058892494863/AnsiballZ_stat.py'
Dec 10 10:13:45 compute-0 sudo[165207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:45 compute-0 python3.9[165209]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:13:45 compute-0 sudo[165207]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:45 compute-0 sudo[165285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roywoxrhxbxfwmsyjhsrsjikyuxcjnmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361624.6784039-332-171058892494863/AnsiballZ_file.py'
Dec 10 10:13:45 compute-0 sudo[165285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:45 compute-0 python3.9[165287]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:13:45 compute-0 sudo[165285]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:46 compute-0 sudo[165437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfnvkzwhmxklsqrnmqtvvmhelqvxlvdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361626.0235622-355-68422409798867/AnsiballZ_file.py'
Dec 10 10:13:46 compute-0 sudo[165437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:46 compute-0 python3.9[165439]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:46 compute-0 sshd-session[162677]: Invalid user NL5xUDpV2xRa from 195.88.120.62 port 46228
Dec 10 10:13:46 compute-0 sshd-session[162677]: fatal: userauth_pubkey: parse publickey packet: incomplete message [preauth]
Dec 10 10:13:46 compute-0 sudo[165437]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:46 compute-0 sudo[165589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aycpabgweeignlekmrmtiijpmomsqgba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361626.7200973-363-123143527144281/AnsiballZ_stat.py'
Dec 10 10:13:46 compute-0 sudo[165589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:47 compute-0 python3.9[165591]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:13:47 compute-0 sudo[165589]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:47 compute-0 sudo[165667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvrnrkajrvjzpqnzfczagkjnuqvgywha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361626.7200973-363-123143527144281/AnsiballZ_file.py'
Dec 10 10:13:47 compute-0 sudo[165667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:47 compute-0 python3.9[165669]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:47 compute-0 sudo[165667]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:48 compute-0 sudo[165819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imheibcibouvfzdylvkcsbqtjtvqvvkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361627.8162487-375-165592302267372/AnsiballZ_stat.py'
Dec 10 10:13:48 compute-0 sudo[165819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:48 compute-0 python3.9[165821]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:13:48 compute-0 sudo[165819]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:48 compute-0 sudo[165897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvfinkvkcqdxoslpcxybuzbucucohzxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361627.8162487-375-165592302267372/AnsiballZ_file.py'
Dec 10 10:13:48 compute-0 sudo[165897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:48 compute-0 python3.9[165899]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:48 compute-0 sudo[165897]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:49 compute-0 sudo[166049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ummxtaqutzdfksqvcgvekafzuecgvzgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361628.9451354-387-112050528368590/AnsiballZ_systemd.py'
Dec 10 10:13:49 compute-0 sudo[166049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:49 compute-0 python3.9[166051]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:13:49 compute-0 systemd[1]: Reloading.
Dec 10 10:13:49 compute-0 systemd-rc-local-generator[166079]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:13:49 compute-0 systemd-sysv-generator[166083]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:13:49 compute-0 sudo[166049]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:50 compute-0 sudo[166239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdywznvwefofgabzieqjdviwhwxnubfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361630.289696-395-53189167025844/AnsiballZ_stat.py'
Dec 10 10:13:50 compute-0 sudo[166239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:50 compute-0 python3.9[166241]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:13:50 compute-0 sudo[166239]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:51 compute-0 sudo[166317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyiuhpxzccmcbllgngcfkrebvfeazlgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361630.289696-395-53189167025844/AnsiballZ_file.py'
Dec 10 10:13:51 compute-0 sudo[166317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:51 compute-0 python3.9[166319]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:51 compute-0 sudo[166317]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:51 compute-0 sudo[166469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzzhqyqeupsmzulzyojfmihkdshruith ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361631.4882267-407-205725274567230/AnsiballZ_stat.py'
Dec 10 10:13:51 compute-0 sudo[166469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:51 compute-0 python3.9[166471]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:13:52 compute-0 sudo[166469]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:52 compute-0 sudo[166547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fppwffcovprawqkayrdunwoowqroazle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361631.4882267-407-205725274567230/AnsiballZ_file.py'
Dec 10 10:13:52 compute-0 sudo[166547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:52 compute-0 python3.9[166549]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:52 compute-0 sudo[166547]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:52 compute-0 sudo[166699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uannkvthfxjoergvhewevvzdwnvsrzjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361632.61806-419-73184951380712/AnsiballZ_systemd.py'
Dec 10 10:13:52 compute-0 sudo[166699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:53 compute-0 python3.9[166701]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:13:53 compute-0 systemd[1]: Reloading.
Dec 10 10:13:53 compute-0 systemd-rc-local-generator[166730]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:13:53 compute-0 systemd-sysv-generator[166733]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:13:53 compute-0 systemd[1]: Starting Create netns directory...
Dec 10 10:13:53 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 10 10:13:53 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 10 10:13:53 compute-0 systemd[1]: Finished Create netns directory.
Dec 10 10:13:53 compute-0 sudo[166699]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:54 compute-0 sudo[166892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdscgkpwtnmzfnzmzrfxamlkdvwddduc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361633.892948-429-246471136764692/AnsiballZ_file.py'
Dec 10 10:13:54 compute-0 sudo[166892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:54 compute-0 python3.9[166894]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:13:54 compute-0 sudo[166892]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:54 compute-0 sudo[167044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqmuodyfpvzsuyuledtuflbkvvijdeum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361634.529597-437-190075512833902/AnsiballZ_stat.py'
Dec 10 10:13:54 compute-0 sudo[167044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:55 compute-0 python3.9[167046]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:13:55 compute-0 sudo[167044]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:55 compute-0 sudo[167167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnxjucixwbotqidtdfhyntiwyhdkegaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361634.529597-437-190075512833902/AnsiballZ_copy.py'
Dec 10 10:13:55 compute-0 sudo[167167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:55 compute-0 python3.9[167169]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361634.529597-437-190075512833902/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:13:55 compute-0 sudo[167167]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:56 compute-0 sudo[167319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqtkdikohpbaktjglgxcahmxiteezasv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361636.082668-454-187757147245549/AnsiballZ_file.py'
Dec 10 10:13:56 compute-0 sudo[167319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:56 compute-0 python3.9[167321]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:13:56 compute-0 sudo[167319]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:57 compute-0 sudo[167471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikdxtdhvtsvzhttwepkdocfrlbvmlqvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361636.8455455-462-202135356276284/AnsiballZ_stat.py'
Dec 10 10:13:57 compute-0 sudo[167471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:57 compute-0 python3.9[167473]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:13:57 compute-0 sudo[167471]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:57 compute-0 sudo[167594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daupjihwaaeiurujsjwvfnghxpxxcbwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361636.8455455-462-202135356276284/AnsiballZ_copy.py'
Dec 10 10:13:57 compute-0 sudo[167594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:57 compute-0 python3.9[167596]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361636.8455455-462-202135356276284/.source.json _original_basename=.6x4v4mg_ follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:57 compute-0 sudo[167594]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:58 compute-0 sudo[167746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdyyewxymhkfeudklvsqdxbrlyaipvgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361638.0461307-477-219932060728691/AnsiballZ_file.py'
Dec 10 10:13:58 compute-0 sudo[167746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:58 compute-0 python3.9[167748]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:13:58 compute-0 sudo[167746]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:59 compute-0 sudo[167898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awslgtqonsrqjopspjyopycqfyoymhup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361638.719742-485-87979741636899/AnsiballZ_stat.py'
Dec 10 10:13:59 compute-0 sudo[167898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:59 compute-0 sudo[167898]: pam_unix(sudo:session): session closed for user root
Dec 10 10:13:59 compute-0 sudo[168021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tirqlhplovncqozbzsxprqrrbtwfliwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361638.719742-485-87979741636899/AnsiballZ_copy.py'
Dec 10 10:13:59 compute-0 sudo[168021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:13:59 compute-0 sudo[168021]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:00 compute-0 sudo[168173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilakkkwxntdriejkkrxtchncoyfhdsyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361640.1716485-502-202294091775548/AnsiballZ_container_config_data.py'
Dec 10 10:14:00 compute-0 sudo[168173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:00 compute-0 python3.9[168175]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 10 10:14:00 compute-0 sudo[168173]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:01 compute-0 sudo[168325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsbmrymvaefbbjvhflkmviadfopzpgff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361641.05207-511-276129602601814/AnsiballZ_container_config_hash.py'
Dec 10 10:14:01 compute-0 sudo[168325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:01 compute-0 python3.9[168327]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 10 10:14:01 compute-0 sudo[168325]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:02 compute-0 sudo[168478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fizrirfpbyuqbjgbrvgxuerwcqhmwybz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361642.161183-520-36820961884432/AnsiballZ_podman_container_info.py'
Dec 10 10:14:02 compute-0 sudo[168478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:02 compute-0 python3.9[168480]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 10 10:14:02 compute-0 sudo[168478]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:03 compute-0 sudo[168656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbcdabsozddgecngttmaoyhukjdgsyyz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361643.4191098-533-165047548144373/AnsiballZ_edpm_container_manage.py'
Dec 10 10:14:03 compute-0 sudo[168656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:04 compute-0 python3[168658]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 10 10:14:04 compute-0 podman[168691]: 2025-12-10 10:14:04.362842917 +0000 UTC m=+0.049872973 container create 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 10 10:14:04 compute-0 podman[168691]: 2025-12-10 10:14:04.339168157 +0000 UTC m=+0.026198273 image pull bcd3898ac099c7fff3d2ff3fc32de931119ed36068f8a2617bd8fa95e51d1b81 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 10 10:14:04 compute-0 python3[168658]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 10 10:14:04 compute-0 sudo[168656]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:04 compute-0 sudo[168876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbnxsyormdyfocndzxpyxcqmyuickkpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361644.6888852-541-188398263478214/AnsiballZ_stat.py'
Dec 10 10:14:04 compute-0 sudo[168876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:05 compute-0 python3.9[168878]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:14:05 compute-0 sudo[168876]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:05 compute-0 sudo[169040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qztpsloouulvyrtigsvuwahohomhrfqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361645.4463453-550-27557597674054/AnsiballZ_file.py'
Dec 10 10:14:05 compute-0 sudo[169040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:05 compute-0 podman[169004]: 2025-12-10 10:14:05.781154391 +0000 UTC m=+0.071075161 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Dec 10 10:14:05 compute-0 python3.9[169046]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:05 compute-0 sudo[169040]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:06 compute-0 sudo[169123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhotpxrznotdqfwsvhmuvwefeetjagyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361645.4463453-550-27557597674054/AnsiballZ_stat.py'
Dec 10 10:14:06 compute-0 sudo[169123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:06 compute-0 python3.9[169125]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:14:06 compute-0 sudo[169123]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:06 compute-0 sudo[169274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkfbzaezckfoqtovjgmyusaasxlmedle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361646.4028764-550-167087807824810/AnsiballZ_copy.py'
Dec 10 10:14:06 compute-0 sudo[169274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:07 compute-0 python3.9[169276]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765361646.4028764-550-167087807824810/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:07 compute-0 sudo[169274]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:07 compute-0 sudo[169350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfkljufkcgojzqpcmdudngvkwvdznhys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361646.4028764-550-167087807824810/AnsiballZ_systemd.py'
Dec 10 10:14:07 compute-0 sudo[169350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:07 compute-0 python3.9[169352]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:14:07 compute-0 systemd[1]: Reloading.
Dec 10 10:14:07 compute-0 systemd-rc-local-generator[169377]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:14:07 compute-0 systemd-sysv-generator[169380]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:14:07 compute-0 sudo[169350]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:08 compute-0 sudo[169461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiwluerwvciacabsyyzivzlcydiwwekb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361646.4028764-550-167087807824810/AnsiballZ_systemd.py'
Dec 10 10:14:08 compute-0 sudo[169461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:08 compute-0 python3.9[169463]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:14:08 compute-0 systemd[1]: Reloading.
Dec 10 10:14:08 compute-0 systemd-rc-local-generator[169492]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:14:08 compute-0 systemd-sysv-generator[169495]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:14:08 compute-0 systemd[1]: Starting multipathd container...
Dec 10 10:14:08 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:14:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/816765d037c4567c2d5e051a50ea1cfa02c59c57a1b18a074ac7b3d1d921ab30/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 10 10:14:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/816765d037c4567c2d5e051a50ea1cfa02c59c57a1b18a074ac7b3d1d921ab30/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 10 10:14:09 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a.
Dec 10 10:14:09 compute-0 podman[169502]: 2025-12-10 10:14:09.033587022 +0000 UTC m=+0.166705231 container init 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Dec 10 10:14:09 compute-0 multipathd[169517]: + sudo -E kolla_set_configs
Dec 10 10:14:09 compute-0 podman[169502]: 2025-12-10 10:14:09.06644241 +0000 UTC m=+0.199560569 container start 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Dec 10 10:14:09 compute-0 sudo[169523]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 10 10:14:09 compute-0 podman[169502]: multipathd
Dec 10 10:14:09 compute-0 sudo[169523]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 10 10:14:09 compute-0 sudo[169523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 10 10:14:09 compute-0 systemd[1]: Started multipathd container.
Dec 10 10:14:09 compute-0 sudo[169461]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:09 compute-0 multipathd[169517]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 10 10:14:09 compute-0 multipathd[169517]: INFO:__main__:Validating config file
Dec 10 10:14:09 compute-0 multipathd[169517]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 10 10:14:09 compute-0 multipathd[169517]: INFO:__main__:Writing out command to execute
Dec 10 10:14:09 compute-0 sudo[169523]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:09 compute-0 podman[169524]: 2025-12-10 10:14:09.153704083 +0000 UTC m=+0.077057829 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:14:09 compute-0 multipathd[169517]: ++ cat /run_command
Dec 10 10:14:09 compute-0 systemd[1]: 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a-125b79ed160d66df.service: Main process exited, code=exited, status=1/FAILURE
Dec 10 10:14:09 compute-0 systemd[1]: 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a-125b79ed160d66df.service: Failed with result 'exit-code'.
Dec 10 10:14:09 compute-0 multipathd[169517]: + CMD='/usr/sbin/multipathd -d'
Dec 10 10:14:09 compute-0 multipathd[169517]: + ARGS=
Dec 10 10:14:09 compute-0 multipathd[169517]: + sudo kolla_copy_cacerts
Dec 10 10:14:09 compute-0 sudo[169548]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 10 10:14:09 compute-0 sudo[169548]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 10 10:14:09 compute-0 sudo[169548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 10 10:14:09 compute-0 sudo[169548]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:09 compute-0 multipathd[169517]: + [[ ! -n '' ]]
Dec 10 10:14:09 compute-0 multipathd[169517]: + . kolla_extend_start
Dec 10 10:14:09 compute-0 multipathd[169517]: Running command: '/usr/sbin/multipathd -d'
Dec 10 10:14:09 compute-0 multipathd[169517]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 10 10:14:09 compute-0 multipathd[169517]: + umask 0022
Dec 10 10:14:09 compute-0 multipathd[169517]: + exec /usr/sbin/multipathd -d
Dec 10 10:14:09 compute-0 multipathd[169517]: 2664.475173 | --------start up--------
Dec 10 10:14:09 compute-0 multipathd[169517]: 2664.475191 | read /etc/multipath.conf
Dec 10 10:14:09 compute-0 multipathd[169517]: 2664.480494 | path checkers start up
Dec 10 10:14:09 compute-0 python3.9[169704]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:14:10 compute-0 sudo[169856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcexvriyggkhivuwpsnwqnbxdhejcjyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361649.990819-586-101057060964088/AnsiballZ_command.py'
Dec 10 10:14:10 compute-0 sudo[169856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:10 compute-0 python3.9[169858]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:14:10 compute-0 sudo[169856]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:11 compute-0 sudo[170021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pduurazklibcafzaiqkcgaopwxikzdjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361650.9869905-594-193914320516241/AnsiballZ_systemd.py'
Dec 10 10:14:11 compute-0 sudo[170021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:11 compute-0 python3.9[170023]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:14:11 compute-0 systemd[1]: Stopping multipathd container...
Dec 10 10:14:11 compute-0 multipathd[169517]: 2666.972653 | exit (signal)
Dec 10 10:14:11 compute-0 multipathd[169517]: 2666.972763 | --------shut down-------
Dec 10 10:14:11 compute-0 systemd[1]: libpod-16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a.scope: Deactivated successfully.
Dec 10 10:14:11 compute-0 conmon[169517]: conmon 16d1756dd065e2f28b29 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a.scope/container/memory.events
Dec 10 10:14:11 compute-0 podman[170028]: 2025-12-10 10:14:11.751940636 +0000 UTC m=+0.104471912 container died 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 10 10:14:11 compute-0 systemd[1]: 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a-125b79ed160d66df.timer: Deactivated successfully.
Dec 10 10:14:11 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a.
Dec 10 10:14:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a-userdata-shm.mount: Deactivated successfully.
Dec 10 10:14:11 compute-0 podman[170027]: 2025-12-10 10:14:11.793297306 +0000 UTC m=+0.148086259 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 10 10:14:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-816765d037c4567c2d5e051a50ea1cfa02c59c57a1b18a074ac7b3d1d921ab30-merged.mount: Deactivated successfully.
Dec 10 10:14:11 compute-0 podman[170028]: 2025-12-10 10:14:11.825305593 +0000 UTC m=+0.177836879 container cleanup 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:14:11 compute-0 podman[170028]: multipathd
Dec 10 10:14:11 compute-0 podman[170078]: multipathd
Dec 10 10:14:11 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 10 10:14:11 compute-0 systemd[1]: Stopped multipathd container.
Dec 10 10:14:11 compute-0 systemd[1]: Starting multipathd container...
Dec 10 10:14:12 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:14:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/816765d037c4567c2d5e051a50ea1cfa02c59c57a1b18a074ac7b3d1d921ab30/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 10 10:14:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/816765d037c4567c2d5e051a50ea1cfa02c59c57a1b18a074ac7b3d1d921ab30/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 10 10:14:12 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a.
Dec 10 10:14:12 compute-0 podman[170091]: 2025-12-10 10:14:12.055847802 +0000 UTC m=+0.136772055 container init 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3)
Dec 10 10:14:12 compute-0 multipathd[170107]: + sudo -E kolla_set_configs
Dec 10 10:14:12 compute-0 sudo[170113]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 10 10:14:12 compute-0 sudo[170113]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 10 10:14:12 compute-0 sudo[170113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 10 10:14:12 compute-0 podman[170091]: 2025-12-10 10:14:12.101463539 +0000 UTC m=+0.182387812 container start 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:14:12 compute-0 podman[170091]: multipathd
Dec 10 10:14:12 compute-0 systemd[1]: Started multipathd container.
Dec 10 10:14:12 compute-0 multipathd[170107]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 10 10:14:12 compute-0 multipathd[170107]: INFO:__main__:Validating config file
Dec 10 10:14:12 compute-0 multipathd[170107]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 10 10:14:12 compute-0 multipathd[170107]: INFO:__main__:Writing out command to execute
Dec 10 10:14:12 compute-0 sudo[170021]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:12 compute-0 sudo[170113]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:12 compute-0 multipathd[170107]: ++ cat /run_command
Dec 10 10:14:12 compute-0 multipathd[170107]: + CMD='/usr/sbin/multipathd -d'
Dec 10 10:14:12 compute-0 multipathd[170107]: + ARGS=
Dec 10 10:14:12 compute-0 multipathd[170107]: + sudo kolla_copy_cacerts
Dec 10 10:14:12 compute-0 podman[170114]: 2025-12-10 10:14:12.172397585 +0000 UTC m=+0.061455611 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:14:12 compute-0 sudo[170138]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 10 10:14:12 compute-0 sudo[170138]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 10 10:14:12 compute-0 sudo[170138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 10 10:14:12 compute-0 systemd[1]: 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a-5c107f9c374f96b7.service: Main process exited, code=exited, status=1/FAILURE
Dec 10 10:14:12 compute-0 systemd[1]: 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a-5c107f9c374f96b7.service: Failed with result 'exit-code'.
Dec 10 10:14:12 compute-0 sudo[170138]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:12 compute-0 multipathd[170107]: + [[ ! -n '' ]]
Dec 10 10:14:12 compute-0 multipathd[170107]: + . kolla_extend_start
Dec 10 10:14:12 compute-0 multipathd[170107]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 10 10:14:12 compute-0 multipathd[170107]: Running command: '/usr/sbin/multipathd -d'
Dec 10 10:14:12 compute-0 multipathd[170107]: + umask 0022
Dec 10 10:14:12 compute-0 multipathd[170107]: + exec /usr/sbin/multipathd -d
Dec 10 10:14:12 compute-0 multipathd[170107]: 2667.465704 | --------start up--------
Dec 10 10:14:12 compute-0 multipathd[170107]: 2667.465719 | read /etc/multipath.conf
Dec 10 10:14:12 compute-0 multipathd[170107]: 2667.470687 | path checkers start up
Dec 10 10:14:12 compute-0 sudo[170298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxvzxnjtcaqepgzxcmvbgqxkqikhaqgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361652.3388495-602-101989576884346/AnsiballZ_file.py'
Dec 10 10:14:12 compute-0 sudo[170298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:12 compute-0 python3.9[170300]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:12 compute-0 sudo[170298]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:13 compute-0 sudo[170450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmgwizfejfyqzwpieyywirhpmxwhonzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361653.193664-614-45719030714024/AnsiballZ_file.py'
Dec 10 10:14:13 compute-0 sudo[170450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:13 compute-0 python3.9[170452]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 10 10:14:13 compute-0 sudo[170450]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:14 compute-0 sudo[170602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjmxxmykdihacjqamvdsxhtxfiblytzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361653.8769498-622-32973754032776/AnsiballZ_modprobe.py'
Dec 10 10:14:14 compute-0 sudo[170602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:14 compute-0 python3.9[170604]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 10 10:14:14 compute-0 kernel: Key type psk registered
Dec 10 10:14:14 compute-0 sudo[170602]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:14 compute-0 sudo[170765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thjcwyjmtneystammjsteiedkdzakuqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361654.6239395-630-224485709712685/AnsiballZ_stat.py'
Dec 10 10:14:14 compute-0 sudo[170765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:15 compute-0 python3.9[170767]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:14:15 compute-0 sudo[170765]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:15 compute-0 sudo[170888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvgucvdgcxzmarljybzqiuchqnwfvjlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361654.6239395-630-224485709712685/AnsiballZ_copy.py'
Dec 10 10:14:15 compute-0 sudo[170888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:15 compute-0 python3.9[170890]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361654.6239395-630-224485709712685/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:15 compute-0 sudo[170888]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:16 compute-0 sudo[171040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsuqxvlxjnggczprxvlqhnupylshuzly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361655.9300723-646-21704969026615/AnsiballZ_lineinfile.py'
Dec 10 10:14:16 compute-0 sudo[171040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:16 compute-0 python3.9[171042]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:16 compute-0 sudo[171040]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:16 compute-0 sudo[171192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubxkijqexnmzeldjfppikuxagoetaylp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361656.5775645-654-94687025013841/AnsiballZ_systemd.py'
Dec 10 10:14:16 compute-0 sudo[171192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:17 compute-0 python3.9[171194]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:14:17 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 10 10:14:17 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 10 10:14:17 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 10 10:14:17 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 10 10:14:17 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 10 10:14:17 compute-0 sudo[171192]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:17 compute-0 sudo[171348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuwjyjfqhnyyaaxvmyfkdskvflvtbdvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361657.4784465-662-97439961772432/AnsiballZ_dnf.py'
Dec 10 10:14:17 compute-0 sudo[171348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:18 compute-0 python3.9[171350]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 10 10:14:20 compute-0 systemd[1]: Reloading.
Dec 10 10:14:20 compute-0 systemd-sysv-generator[171384]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:14:20 compute-0 systemd-rc-local-generator[171380]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:14:21 compute-0 systemd[1]: Reloading.
Dec 10 10:14:21 compute-0 systemd-rc-local-generator[171418]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:14:21 compute-0 systemd-sysv-generator[171421]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:14:21 compute-0 systemd-logind[787]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 10 10:14:21 compute-0 systemd-logind[787]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 10 10:14:21 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 10 10:14:21 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 10 10:14:21 compute-0 systemd[1]: Reloading.
Dec 10 10:14:21 compute-0 systemd-rc-local-generator[171511]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:14:21 compute-0 systemd-sysv-generator[171516]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:14:22 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 10 10:14:22 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 10 10:14:22 compute-0 sudo[171348]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:23 compute-0 sudo[172814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drvbfcehrndsjawsqilwmkmsjecvmmxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361662.9679902-670-154729458793740/AnsiballZ_systemd_service.py'
Dec 10 10:14:23 compute-0 sudo[172814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:23 compute-0 python3.9[172816]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:14:23 compute-0 iscsid[161182]: iscsid shutting down.
Dec 10 10:14:23 compute-0 systemd[1]: Stopping Open-iSCSI...
Dec 10 10:14:23 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Dec 10 10:14:23 compute-0 systemd[1]: Stopped Open-iSCSI.
Dec 10 10:14:23 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 10 10:14:23 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 10 10:14:23 compute-0 systemd[1]: Started Open-iSCSI.
Dec 10 10:14:23 compute-0 sudo[172814]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:23 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 10 10:14:23 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 10 10:14:23 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.590s CPU time.
Dec 10 10:14:23 compute-0 systemd[1]: run-r639ed0a11689448c8ad3d1f064730948.service: Deactivated successfully.
Dec 10 10:14:23 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 10 10:14:24 compute-0 python3.9[172974]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:14:24 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 10 10:14:25 compute-0 sudo[173129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhrxwogtmlufmewuufizyseokesqyixz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361665.0780084-688-53290521140183/AnsiballZ_file.py'
Dec 10 10:14:25 compute-0 sudo[173129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:25 compute-0 python3.9[173131]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:25 compute-0 sudo[173129]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:26 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 10 10:14:26 compute-0 sudo[173282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdnjjyyqrqphacmjajzwpghfrbzzgnfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361666.0169928-699-90924187005577/AnsiballZ_systemd_service.py'
Dec 10 10:14:26 compute-0 sudo[173282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:26 compute-0 python3.9[173284]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:14:26 compute-0 systemd[1]: Reloading.
Dec 10 10:14:26 compute-0 systemd-rc-local-generator[173305]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:14:26 compute-0 systemd-sysv-generator[173312]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:14:26 compute-0 sudo[173282]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:27 compute-0 python3.9[173469]: ansible-ansible.builtin.service_facts Invoked
Dec 10 10:14:27 compute-0 network[173486]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 10 10:14:27 compute-0 network[173487]: 'network-scripts' will be removed from distribution in near future.
Dec 10 10:14:27 compute-0 network[173488]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 10 10:14:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:14:31.454 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:14:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:14:31.456 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:14:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:14:31.456 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:14:32 compute-0 sudo[173760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykirtbzdvimqzrmyagfvwsuxdwappbju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361672.2189867-718-1597067867915/AnsiballZ_systemd_service.py'
Dec 10 10:14:32 compute-0 sudo[173760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:32 compute-0 python3.9[173762]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:14:32 compute-0 sudo[173760]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:33 compute-0 sudo[173913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqjgjsbsdkeprmmlpmmagopqrexoceyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361673.0344336-718-33880196700979/AnsiballZ_systemd_service.py'
Dec 10 10:14:33 compute-0 sudo[173913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:33 compute-0 python3.9[173915]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:14:33 compute-0 sudo[173913]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:34 compute-0 sudo[174066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibnofmxsdtffaenchmvzttnxqmqpsdmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361673.874307-718-187166112260238/AnsiballZ_systemd_service.py'
Dec 10 10:14:34 compute-0 sudo[174066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:34 compute-0 python3.9[174068]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:14:34 compute-0 sudo[174066]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:34 compute-0 sudo[174219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjrwfpsqaqrmacqmxmniusggdzrllrkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361674.6786606-718-171793047626951/AnsiballZ_systemd_service.py'
Dec 10 10:14:34 compute-0 sudo[174219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:35 compute-0 python3.9[174221]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:14:35 compute-0 sudo[174219]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:35 compute-0 sudo[174372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgnzhesmswcsnrjcwsdthpomwjacdxhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361675.4953396-718-108903647840420/AnsiballZ_systemd_service.py'
Dec 10 10:14:35 compute-0 sudo[174372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:35 compute-0 podman[174374]: 2025-12-10 10:14:35.8975086 +0000 UTC m=+0.080403782 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 10 10:14:36 compute-0 python3.9[174375]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:14:36 compute-0 sudo[174372]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:36 compute-0 sudo[174544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygmtvgofsqzffhignmvvumlrelrygxmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361676.2865412-718-236403719779770/AnsiballZ_systemd_service.py'
Dec 10 10:14:36 compute-0 sudo[174544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:36 compute-0 python3.9[174546]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:14:36 compute-0 sudo[174544]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:37 compute-0 sudo[174697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xciykesjatzlzzllvwvpqtcjbkyhsycd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361677.085427-718-83489399268933/AnsiballZ_systemd_service.py'
Dec 10 10:14:37 compute-0 sudo[174697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:37 compute-0 python3.9[174699]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:14:37 compute-0 sudo[174697]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:38 compute-0 sudo[174850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfmekgdudhpllhstzruszddppgtmxzzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361677.9028108-718-187121632532199/AnsiballZ_systemd_service.py'
Dec 10 10:14:38 compute-0 sudo[174850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:38 compute-0 python3.9[174852]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:14:38 compute-0 sudo[174850]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:39 compute-0 sudo[175003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubacjgzttwszvlagjilnucxhepuajrdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361678.8779767-777-6360698047694/AnsiballZ_file.py'
Dec 10 10:14:39 compute-0 sudo[175003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:39 compute-0 python3.9[175005]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:39 compute-0 sudo[175003]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:39 compute-0 sudo[175155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mksbdtxqjfkvequpnkxsfsnkqoqgdqjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361679.527858-777-6172492431905/AnsiballZ_file.py'
Dec 10 10:14:39 compute-0 sudo[175155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:39 compute-0 python3.9[175157]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:39 compute-0 sudo[175155]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:40 compute-0 sudo[175307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-earmaqrdvotqgqsuosnsnviytpuukrky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361680.1145418-777-2876643539030/AnsiballZ_file.py'
Dec 10 10:14:40 compute-0 sudo[175307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:40 compute-0 python3.9[175309]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:40 compute-0 sudo[175307]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:41 compute-0 sudo[175459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dibqovhgejmlwvkkaghqxnddjwgonhcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361680.8244314-777-115365831062027/AnsiballZ_file.py'
Dec 10 10:14:41 compute-0 sudo[175459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:41 compute-0 python3.9[175461]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:41 compute-0 sudo[175459]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:41 compute-0 sudo[175611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guuotubjqushlapxoizervfpjxzqttzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361681.478192-777-233028147767235/AnsiballZ_file.py'
Dec 10 10:14:41 compute-0 sudo[175611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:41 compute-0 python3.9[175613]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:41 compute-0 sudo[175611]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:42 compute-0 podman[175614]: 2025-12-10 10:14:42.027014907 +0000 UTC m=+0.073981003 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 10 10:14:42 compute-0 sudo[175802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrjmvshzgrmbrpyymfipmyfctutxzeom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361682.0334823-777-149024018583290/AnsiballZ_file.py'
Dec 10 10:14:42 compute-0 sudo[175802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:42 compute-0 podman[175763]: 2025-12-10 10:14:42.319262384 +0000 UTC m=+0.075168453 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 10 10:14:42 compute-0 python3.9[175811]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:42 compute-0 sudo[175802]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:42 compute-0 sudo[175961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efaldqqjxjvjrziddnpsdzxztrejtubc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361682.637526-777-147994950827743/AnsiballZ_file.py'
Dec 10 10:14:42 compute-0 sudo[175961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:43 compute-0 python3.9[175963]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:43 compute-0 sudo[175961]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:43 compute-0 sudo[176113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyueqnpmmdwvyjugenbspgzwqvfxxlbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361683.2302406-777-150215725157812/AnsiballZ_file.py'
Dec 10 10:14:43 compute-0 sudo[176113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:43 compute-0 python3.9[176115]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:43 compute-0 sudo[176113]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:44 compute-0 sudo[176265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqlkugeztlhyaswgnkgwehshsprymeac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361683.8183172-834-99981090689750/AnsiballZ_file.py'
Dec 10 10:14:44 compute-0 sudo[176265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:44 compute-0 python3.9[176267]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:44 compute-0 sudo[176265]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:44 compute-0 sudo[176417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahhgztaycluxjfrumflgdwodaxejgmqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361684.3769886-834-98969058882972/AnsiballZ_file.py'
Dec 10 10:14:44 compute-0 sudo[176417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:44 compute-0 python3.9[176419]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:44 compute-0 sudo[176417]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:45 compute-0 sudo[176569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsvvphnjtvycnlvsijlnyktcbqmgdyks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361684.9614058-834-60594904265982/AnsiballZ_file.py'
Dec 10 10:14:45 compute-0 sudo[176569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:45 compute-0 python3.9[176571]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:45 compute-0 sudo[176569]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:45 compute-0 sudo[176721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nellqzlqlncxgsihpdanejhcllzysqeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361685.5426726-834-160468066881296/AnsiballZ_file.py'
Dec 10 10:14:45 compute-0 sudo[176721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:45 compute-0 python3.9[176723]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:45 compute-0 sudo[176721]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:46 compute-0 sudo[176873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqzcmpjxcsuyrmawzhltsuusnwmzumvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361686.1252654-834-134302935879301/AnsiballZ_file.py'
Dec 10 10:14:46 compute-0 sudo[176873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:46 compute-0 python3.9[176875]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:46 compute-0 sudo[176873]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:47 compute-0 sudo[177025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kapbrxhxshbxtgyyqgjcxaclxkmjasal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361686.7980442-834-230647796200423/AnsiballZ_file.py'
Dec 10 10:14:47 compute-0 sudo[177025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:47 compute-0 python3.9[177027]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:47 compute-0 sudo[177025]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:47 compute-0 sudo[177177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbzancpniahhhgislsadbbeephfessby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361687.4243472-834-202301531636953/AnsiballZ_file.py'
Dec 10 10:14:47 compute-0 sudo[177177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:47 compute-0 python3.9[177179]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:47 compute-0 sudo[177177]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:48 compute-0 sudo[177329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpzfhufuwxvtqmyrvaykmrxerejhlvue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361687.9956005-834-200141369210546/AnsiballZ_file.py'
Dec 10 10:14:48 compute-0 sudo[177329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:48 compute-0 python3.9[177331]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:14:48 compute-0 sudo[177329]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:48 compute-0 sudo[177481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zphsujvwqxlyjmzcknkqahlywjoenzyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361688.683538-892-228062780661604/AnsiballZ_command.py'
Dec 10 10:14:48 compute-0 sudo[177481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:49 compute-0 python3.9[177483]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:14:49 compute-0 sudo[177481]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:50 compute-0 python3.9[177635]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 10 10:14:50 compute-0 sudo[177785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsmtsiermoqxuvfdivmsczeezixdhvbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361690.4379735-910-7577787323288/AnsiballZ_systemd_service.py'
Dec 10 10:14:50 compute-0 sudo[177785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:51 compute-0 python3.9[177787]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:14:51 compute-0 systemd[1]: Reloading.
Dec 10 10:14:51 compute-0 systemd-sysv-generator[177818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:14:51 compute-0 systemd-rc-local-generator[177814]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:14:51 compute-0 sudo[177785]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:51 compute-0 sudo[177972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcrswcvhzrhyvqgxutzgqitzgseldwwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361691.5436623-918-159547659409181/AnsiballZ_command.py'
Dec 10 10:14:51 compute-0 sudo[177972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:52 compute-0 python3.9[177974]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:14:52 compute-0 sudo[177972]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:52 compute-0 sudo[178125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjsrbpyiehyatvcrzwdlqakemsafiwhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361692.2439773-918-213900662303436/AnsiballZ_command.py'
Dec 10 10:14:52 compute-0 sudo[178125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:52 compute-0 python3.9[178127]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:14:52 compute-0 sudo[178125]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:53 compute-0 sudo[178278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvzpchxqlahrplwehbuevggiwmymmkpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361692.9028494-918-111465148074523/AnsiballZ_command.py'
Dec 10 10:14:53 compute-0 sudo[178278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:53 compute-0 python3.9[178280]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:14:53 compute-0 sudo[178278]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:53 compute-0 sudo[178431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwpqvtaxufqlaajnjywsvnrjubeslcry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361693.5900111-918-241378443544941/AnsiballZ_command.py'
Dec 10 10:14:53 compute-0 sudo[178431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:54 compute-0 python3.9[178433]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:14:54 compute-0 sudo[178431]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:54 compute-0 sudo[178584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clztkrmnbrnaazegplobkrkfxlryqsqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361694.1470268-918-243965497824101/AnsiballZ_command.py'
Dec 10 10:14:54 compute-0 sudo[178584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:54 compute-0 python3.9[178586]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:14:54 compute-0 sudo[178584]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:54 compute-0 sudo[178737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdfcdclbnpwrltiyopselxnlayjjbvus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361694.748236-918-21946183580679/AnsiballZ_command.py'
Dec 10 10:14:54 compute-0 sudo[178737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:55 compute-0 python3.9[178739]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:14:55 compute-0 sudo[178737]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:55 compute-0 sudo[178890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvatokpyrizrjnjxxqeofquxcxyigeoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361695.3421652-918-179508052581542/AnsiballZ_command.py'
Dec 10 10:14:55 compute-0 sudo[178890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:55 compute-0 python3.9[178892]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:14:55 compute-0 sudo[178890]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:56 compute-0 sudo[179043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aokyyjljcgjfenqwicxuuvwyyjcpwzlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361695.9933507-918-88064281011714/AnsiballZ_command.py'
Dec 10 10:14:56 compute-0 sudo[179043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:56 compute-0 python3.9[179045]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:14:56 compute-0 sudo[179043]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:57 compute-0 sudo[179196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzlsenmmgxhiwpuymvslivsqqfnlautu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361697.4607816-997-141579659372585/AnsiballZ_file.py'
Dec 10 10:14:57 compute-0 sudo[179196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:57 compute-0 python3.9[179198]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:14:57 compute-0 sudo[179196]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:58 compute-0 sudo[179348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmhvseilctmnyeernktzuupzowqfcjmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361698.1187034-997-277542086479153/AnsiballZ_file.py'
Dec 10 10:14:58 compute-0 sudo[179348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:58 compute-0 python3.9[179350]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:14:58 compute-0 sudo[179348]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:59 compute-0 sudo[179500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fggavicjjwxsfelsabsnakpojytorunf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361698.7378142-997-208443009496029/AnsiballZ_file.py'
Dec 10 10:14:59 compute-0 sudo[179500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:59 compute-0 python3.9[179502]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:14:59 compute-0 sudo[179500]: pam_unix(sudo:session): session closed for user root
Dec 10 10:14:59 compute-0 sudo[179652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbdukdaakkhjjkrvlnrwzygityiugfip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361699.419456-1019-266332404070176/AnsiballZ_file.py'
Dec 10 10:14:59 compute-0 sudo[179652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:14:59 compute-0 python3.9[179654]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:14:59 compute-0 sudo[179652]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:00 compute-0 sudo[179804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obdbkokzswvonomnfvhrrdedhxzvntim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361700.0434968-1019-397059249769/AnsiballZ_file.py'
Dec 10 10:15:00 compute-0 sudo[179804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:00 compute-0 python3.9[179806]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:00 compute-0 sudo[179804]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:00 compute-0 sudo[179956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzkqmuaxctpfouggncazderfbsdkicxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361700.6933434-1019-234669025791627/AnsiballZ_file.py'
Dec 10 10:15:00 compute-0 sudo[179956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:01 compute-0 python3.9[179958]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:01 compute-0 sudo[179956]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:01 compute-0 sudo[180108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmahqkvyukdsiwwuhlzohrbszrfonsnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361701.3030598-1019-29249242579253/AnsiballZ_file.py'
Dec 10 10:15:01 compute-0 sudo[180108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:01 compute-0 python3.9[180110]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:01 compute-0 sudo[180108]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:02 compute-0 sudo[180260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tddncryhqxlvijmjwzuojqkgcxvjbatv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361701.9121594-1019-216144582065103/AnsiballZ_file.py'
Dec 10 10:15:02 compute-0 sudo[180260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:02 compute-0 python3.9[180262]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:02 compute-0 sudo[180260]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:02 compute-0 sudo[180412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmuhjkjsnynoavlmdsrgajczcmjbburz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361702.47729-1019-31683127290042/AnsiballZ_file.py'
Dec 10 10:15:02 compute-0 sudo[180412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:03 compute-0 python3.9[180414]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:03 compute-0 sudo[180412]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:03 compute-0 sudo[180564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejodfjppzsgtqppabbtocsovxjbcyvrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361703.1971323-1019-264499648931388/AnsiballZ_file.py'
Dec 10 10:15:03 compute-0 sudo[180564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:03 compute-0 python3.9[180566]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:03 compute-0 sudo[180564]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:06 compute-0 podman[180591]: 2025-12-10 10:15:06.01316093 +0000 UTC m=+0.059306314 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 10 10:15:07 compute-0 sudo[180736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzfxjzaxsonqkljdafqswmskbrzgzxef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361707.572079-1188-96543357564452/AnsiballZ_getent.py'
Dec 10 10:15:07 compute-0 sudo[180736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:08 compute-0 python3.9[180738]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 10 10:15:08 compute-0 sudo[180736]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:08 compute-0 sudo[180889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqiklcicgwyprsabjjorenptvuqfgecp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361708.4116185-1196-69129932653765/AnsiballZ_group.py'
Dec 10 10:15:08 compute-0 sudo[180889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:09 compute-0 python3.9[180891]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 10 10:15:09 compute-0 groupadd[180892]: group added to /etc/group: name=nova, GID=42436
Dec 10 10:15:09 compute-0 groupadd[180892]: group added to /etc/gshadow: name=nova
Dec 10 10:15:09 compute-0 groupadd[180892]: new group: name=nova, GID=42436
Dec 10 10:15:09 compute-0 sudo[180889]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:09 compute-0 sudo[181047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgsjnmjkykqhplkcaxzkgjxybtcdqayq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361709.3144615-1204-133700074260769/AnsiballZ_user.py'
Dec 10 10:15:09 compute-0 sudo[181047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:10 compute-0 python3.9[181049]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 10 10:15:10 compute-0 useradd[181051]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Dec 10 10:15:10 compute-0 useradd[181051]: add 'nova' to group 'libvirt'
Dec 10 10:15:10 compute-0 useradd[181051]: add 'nova' to shadow group 'libvirt'
Dec 10 10:15:10 compute-0 sudo[181047]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:10 compute-0 sshd-session[181082]: Accepted publickey for zuul from 192.168.122.30 port 50232 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:15:10 compute-0 systemd-logind[787]: New session 25 of user zuul.
Dec 10 10:15:11 compute-0 systemd[1]: Started Session 25 of User zuul.
Dec 10 10:15:11 compute-0 sshd-session[181082]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:15:11 compute-0 sshd-session[181085]: Received disconnect from 192.168.122.30 port 50232:11: disconnected by user
Dec 10 10:15:11 compute-0 sshd-session[181085]: Disconnected from user zuul 192.168.122.30 port 50232
Dec 10 10:15:11 compute-0 sshd-session[181082]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:15:11 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Dec 10 10:15:11 compute-0 systemd-logind[787]: Session 25 logged out. Waiting for processes to exit.
Dec 10 10:15:11 compute-0 systemd-logind[787]: Removed session 25.
Dec 10 10:15:11 compute-0 python3.9[181235]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:15:12 compute-0 podman[181330]: 2025-12-10 10:15:12.307638682 +0000 UTC m=+0.104689434 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:15:12 compute-0 python3.9[181371]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361711.3370345-1229-29390447796299/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:12 compute-0 podman[181384]: 2025-12-10 10:15:12.534514657 +0000 UTC m=+0.083153044 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, tcib_managed=true)
Dec 10 10:15:13 compute-0 python3.9[181551]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:15:13 compute-0 python3.9[181627]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:14 compute-0 python3.9[181777]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:15:14 compute-0 python3.9[181898]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361713.685573-1229-91115741015080/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:15 compute-0 python3.9[182048]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:15:15 compute-0 python3.9[182169]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361714.8119886-1229-29841533681952/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:16 compute-0 python3.9[182319]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:15:17 compute-0 python3.9[182440]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361716.0874443-1229-129779045877729/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:17 compute-0 python3.9[182590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:15:18 compute-0 python3.9[182711]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361717.316704-1229-55324122926767/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:18 compute-0 sudo[182861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pocdowmvjmhyzfwedciixxmzuibelupo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361718.5852168-1312-6330453769768/AnsiballZ_file.py'
Dec 10 10:15:18 compute-0 sudo[182861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:19 compute-0 python3.9[182863]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:15:19 compute-0 sudo[182861]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:19 compute-0 sudo[183013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thqdpbpdjktcwivybetzrxgssnartvuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361719.377502-1320-53488481450575/AnsiballZ_copy.py'
Dec 10 10:15:19 compute-0 sudo[183013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:19 compute-0 python3.9[183015]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:15:19 compute-0 sudo[183013]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:20 compute-0 sudo[183165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmidhhwggmzpcfqxpyvifqrkugcjlvzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361720.0155687-1328-94281444817044/AnsiballZ_stat.py'
Dec 10 10:15:20 compute-0 sudo[183165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:20 compute-0 python3.9[183167]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:15:20 compute-0 sudo[183165]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:21 compute-0 sudo[183317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luygavnuypglfcimggzavfiddalcxdpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361720.7605565-1336-145731892572511/AnsiballZ_stat.py'
Dec 10 10:15:21 compute-0 sudo[183317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:21 compute-0 python3.9[183319]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:15:21 compute-0 sudo[183317]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:21 compute-0 sudo[183440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrehidjqyzweoztcdvmbpppjxuxronsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361720.7605565-1336-145731892572511/AnsiballZ_copy.py'
Dec 10 10:15:21 compute-0 sudo[183440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:21 compute-0 python3.9[183442]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1765361720.7605565-1336-145731892572511/.source _original_basename=.b8gixnrv follow=False checksum=52cad496b3ac3c41ccfa1b04654020de3e4bcda6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 10 10:15:21 compute-0 sudo[183440]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:22 compute-0 python3.9[183594]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:15:23 compute-0 python3.9[183746]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:15:23 compute-0 python3.9[183867]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361722.8192594-1362-261479007816135/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:24 compute-0 python3.9[184017]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:15:25 compute-0 python3.9[184138]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361723.9975061-1377-162966832096885/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:15:25 compute-0 sudo[184288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urrqkknhhxkqlaudvdqmldqpvrvtypvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361725.373557-1394-186594929636611/AnsiballZ_container_config_data.py'
Dec 10 10:15:25 compute-0 sudo[184288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:25 compute-0 python3.9[184290]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 10 10:15:25 compute-0 sudo[184288]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:26 compute-0 sudo[184440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltahwwpptcoppbmyzqafhklnublgmqnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361726.1604285-1403-222101553112462/AnsiballZ_container_config_hash.py'
Dec 10 10:15:26 compute-0 sudo[184440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:26 compute-0 python3.9[184442]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 10 10:15:26 compute-0 sudo[184440]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:27 compute-0 sudo[184592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kktopkvmeqmxlmranocktcgziiolztid ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361727.0271454-1413-181186841547336/AnsiballZ_edpm_container_manage.py'
Dec 10 10:15:27 compute-0 sudo[184592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:27 compute-0 python3[184594]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 10 10:15:27 compute-0 podman[184632]: 2025-12-10 10:15:27.872914054 +0000 UTC m=+0.067702220 container create f476b6a15ae20982cd7462bdd9b223673c4cfbfd6c37c5e00d20a34b79b70cd7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Dec 10 10:15:27 compute-0 podman[184632]: 2025-12-10 10:15:27.838648654 +0000 UTC m=+0.033436800 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 10 10:15:27 compute-0 python3[184594]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 10 10:15:28 compute-0 sudo[184592]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:28 compute-0 sudo[184820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxhtwfzunhtkzljhnstrypvjiajvngck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361728.1589732-1421-75544579842155/AnsiballZ_stat.py'
Dec 10 10:15:28 compute-0 sudo[184820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:28 compute-0 python3.9[184822]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:15:28 compute-0 sudo[184820]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:29 compute-0 sudo[184974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jasurjrzdznkaigpkoscajsvrtpurczd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361729.1301284-1433-44117812894169/AnsiballZ_container_config_data.py'
Dec 10 10:15:29 compute-0 sudo[184974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:29 compute-0 python3.9[184976]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 10 10:15:29 compute-0 sudo[184974]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:30 compute-0 sudo[185126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkbediaxirrmbswpdyavzqvixsulfjur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361729.8070185-1442-268474507308721/AnsiballZ_container_config_hash.py'
Dec 10 10:15:30 compute-0 sudo[185126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:30 compute-0 python3.9[185128]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 10 10:15:30 compute-0 sudo[185126]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:30 compute-0 sudo[185278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plmutynzvcdmqjuloczvycnvzfxcdvwn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361730.6632934-1452-96256603871442/AnsiballZ_edpm_container_manage.py'
Dec 10 10:15:30 compute-0 sudo[185278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:31 compute-0 python3[185280]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 10 10:15:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:15:31.455 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:15:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:15:31.457 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:15:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:15:31.457 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:15:31 compute-0 podman[185319]: 2025-12-10 10:15:31.499215686 +0000 UTC m=+0.052441651 container create b084993083eae38031b275ffab605ddaafe09a4bf5c336bdf678bbef18bb87c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20251202, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:15:31 compute-0 podman[185319]: 2025-12-10 10:15:31.473123835 +0000 UTC m=+0.026349800 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 10 10:15:31 compute-0 python3[185280]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec 10 10:15:31 compute-0 sudo[185278]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:32 compute-0 sudo[185508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncrqlgcihpyzzaqcfddkzkmvdanicxtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361731.8212879-1460-193574323184871/AnsiballZ_stat.py'
Dec 10 10:15:32 compute-0 sudo[185508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:32 compute-0 python3.9[185510]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:15:32 compute-0 sudo[185508]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:32 compute-0 sudo[185662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gehgqdvpbxdgrrxrqekkxuqegarvzzxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361732.5833254-1469-197395951211387/AnsiballZ_file.py'
Dec 10 10:15:32 compute-0 sudo[185662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:33 compute-0 python3.9[185664]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:15:33 compute-0 sudo[185662]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:33 compute-0 sudo[185813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hocqcrvvaayegmoqhmtcexdbzymsfjin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361733.1516056-1469-209039139140760/AnsiballZ_copy.py'
Dec 10 10:15:33 compute-0 sudo[185813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:33 compute-0 python3.9[185815]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765361733.1516056-1469-209039139140760/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:15:33 compute-0 sudo[185813]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:34 compute-0 sudo[185889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugmxmbeenemmmfjxlhhfszwtpwquiluo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361733.1516056-1469-209039139140760/AnsiballZ_systemd.py'
Dec 10 10:15:34 compute-0 sudo[185889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:34 compute-0 python3.9[185891]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:15:34 compute-0 systemd[1]: Reloading.
Dec 10 10:15:34 compute-0 systemd-sysv-generator[185920]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:15:34 compute-0 systemd-rc-local-generator[185915]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:15:34 compute-0 sudo[185889]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:34 compute-0 sudo[186001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skjpagzxnxqwkozjwbkvjyoezbuqtlvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361733.1516056-1469-209039139140760/AnsiballZ_systemd.py'
Dec 10 10:15:34 compute-0 sudo[186001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:35 compute-0 python3.9[186003]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:15:35 compute-0 systemd[1]: Reloading.
Dec 10 10:15:35 compute-0 systemd-sysv-generator[186035]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:15:35 compute-0 systemd-rc-local-generator[186032]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:15:35 compute-0 systemd[1]: Starting nova_compute container...
Dec 10 10:15:35 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:15:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62555d6373f0bc57f99d72dc5d8e0c233dabd497c23d73123f5600004fe34c5b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62555d6373f0bc57f99d72dc5d8e0c233dabd497c23d73123f5600004fe34c5b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62555d6373f0bc57f99d72dc5d8e0c233dabd497c23d73123f5600004fe34c5b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62555d6373f0bc57f99d72dc5d8e0c233dabd497c23d73123f5600004fe34c5b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62555d6373f0bc57f99d72dc5d8e0c233dabd497c23d73123f5600004fe34c5b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:35 compute-0 podman[186042]: 2025-12-10 10:15:35.857092114 +0000 UTC m=+0.193072079 container init b084993083eae38031b275ffab605ddaafe09a4bf5c336bdf678bbef18bb87c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 10 10:15:35 compute-0 podman[186042]: 2025-12-10 10:15:35.866764164 +0000 UTC m=+0.202744029 container start b084993083eae38031b275ffab605ddaafe09a4bf5c336bdf678bbef18bb87c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3)
Dec 10 10:15:35 compute-0 podman[186042]: nova_compute
Dec 10 10:15:35 compute-0 nova_compute[186057]: + sudo -E kolla_set_configs
Dec 10 10:15:35 compute-0 systemd[1]: Started nova_compute container.
Dec 10 10:15:35 compute-0 sudo[186001]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Validating config file
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Copying service configuration files
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Deleting /etc/ceph
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Creating directory /etc/ceph
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Setting permission for /etc/ceph
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Writing out command to execute
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 10 10:15:35 compute-0 nova_compute[186057]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 10 10:15:35 compute-0 nova_compute[186057]: ++ cat /run_command
Dec 10 10:15:35 compute-0 nova_compute[186057]: + CMD=nova-compute
Dec 10 10:15:35 compute-0 nova_compute[186057]: + ARGS=
Dec 10 10:15:35 compute-0 nova_compute[186057]: + sudo kolla_copy_cacerts
Dec 10 10:15:36 compute-0 nova_compute[186057]: + [[ ! -n '' ]]
Dec 10 10:15:36 compute-0 nova_compute[186057]: + . kolla_extend_start
Dec 10 10:15:36 compute-0 nova_compute[186057]: + echo 'Running command: '\''nova-compute'\'''
Dec 10 10:15:36 compute-0 nova_compute[186057]: Running command: 'nova-compute'
Dec 10 10:15:36 compute-0 nova_compute[186057]: + umask 0022
Dec 10 10:15:36 compute-0 nova_compute[186057]: + exec nova-compute
Dec 10 10:15:36 compute-0 podman[186192]: 2025-12-10 10:15:36.694930285 +0000 UTC m=+0.094181111 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 10 10:15:36 compute-0 python3.9[186229]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:15:37 compute-0 python3.9[186387]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:15:37 compute-0 nova_compute[186057]: 2025-12-10 10:15:37.989 186061 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 10 10:15:37 compute-0 nova_compute[186057]: 2025-12-10 10:15:37.990 186061 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 10 10:15:37 compute-0 nova_compute[186057]: 2025-12-10 10:15:37.990 186061 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 10 10:15:37 compute-0 nova_compute[186057]: 2025-12-10 10:15:37.990 186061 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.127 186061 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.150 186061 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.151 186061 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 10 10:15:38 compute-0 python3.9[186539]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.733 186061 INFO nova.virt.driver [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.859 186061 INFO nova.compute.provider_config [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.897 186061 DEBUG oslo_concurrency.lockutils [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.898 186061 DEBUG oslo_concurrency.lockutils [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.898 186061 DEBUG oslo_concurrency.lockutils [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.899 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.899 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.899 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.899 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.900 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.900 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.900 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.901 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.901 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.901 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.901 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.902 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.902 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.902 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.903 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.903 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.904 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.904 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.904 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.905 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.905 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.905 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.906 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.906 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.906 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.907 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.907 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.908 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.908 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.908 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.909 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.909 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.909 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.910 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.910 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.911 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.911 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.911 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.912 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.912 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.912 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.913 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.913 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.913 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.914 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.914 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.914 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.914 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.915 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.915 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.915 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.915 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.916 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.916 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.916 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.917 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.917 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.917 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.918 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.918 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.918 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.918 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.919 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.919 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.919 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.919 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.920 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.920 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.920 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.921 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.921 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.921 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.922 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.922 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.922 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.922 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.923 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.923 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.923 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.923 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.924 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.924 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.924 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.924 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.926 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.926 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.926 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.926 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.927 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.927 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.927 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.927 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.927 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.928 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.928 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.928 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.928 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.928 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.929 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.929 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.929 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.930 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.930 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.930 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.930 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.930 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.931 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.931 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.931 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.931 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.931 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.932 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.932 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.932 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.932 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.932 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.933 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.933 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.933 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.934 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.934 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.934 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.934 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.934 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.935 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.935 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.935 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.935 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.935 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.936 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.936 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.936 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.936 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.937 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.937 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.937 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.937 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.937 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.938 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.938 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.938 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.938 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.938 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.938 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.939 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.939 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.939 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.939 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.940 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.940 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.940 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.940 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.941 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.941 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.941 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.941 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.941 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.942 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.942 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.942 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.942 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.942 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.942 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.943 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.943 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.943 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.943 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.943 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.943 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.944 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.944 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.944 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.944 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.944 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.944 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.944 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.945 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.945 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.945 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.945 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.945 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.945 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.946 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.946 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.946 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.946 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.946 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.946 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.946 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.947 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.947 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.947 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.947 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.947 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.947 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.947 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.948 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.948 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.948 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.948 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.948 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.948 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.948 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.949 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.949 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.949 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.949 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.949 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.949 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.949 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.950 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.950 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.950 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.950 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.950 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.950 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.951 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.951 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.951 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.951 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.951 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.951 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.951 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.952 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.952 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.952 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.952 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.952 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.952 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.952 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.953 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.953 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.953 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.953 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.953 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.954 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.954 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.954 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.954 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.955 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.955 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.955 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.955 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.955 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.955 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.955 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.956 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.956 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.956 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.956 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.956 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.956 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.956 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.957 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.957 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.957 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.957 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.957 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.957 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.958 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.958 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.958 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.958 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.958 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.958 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.959 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.959 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.959 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.959 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.959 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.960 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.960 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.960 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.960 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.960 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.960 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.961 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.961 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.961 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.961 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.961 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.961 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.962 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.962 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.962 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.962 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.962 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.962 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.962 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.962 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.963 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.963 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.963 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.963 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.963 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.963 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.964 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.964 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.964 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.964 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.964 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.964 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.965 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.965 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.965 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.965 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.965 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.965 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.966 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.966 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.966 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.966 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.966 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.966 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.966 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.967 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.967 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.967 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.967 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.967 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.967 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.967 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.968 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.968 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.968 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.968 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.968 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.968 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.969 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.969 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.969 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.969 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.969 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.969 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.970 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.970 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.970 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.970 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.970 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.970 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.970 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.971 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.971 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.971 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.971 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.972 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.972 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.972 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.972 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.972 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.973 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.973 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.973 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.973 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.973 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.974 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.974 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.974 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.974 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.974 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.974 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.974 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.975 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.975 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.975 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.975 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.975 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.975 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.975 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.976 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.976 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.976 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.976 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.976 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.976 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.976 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.977 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.977 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.977 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.977 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.977 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.977 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.977 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.978 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.978 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.978 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.978 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.978 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.978 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.979 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.979 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.979 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.979 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.979 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.979 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.979 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.980 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.980 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.980 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.980 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.980 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.981 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.981 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.981 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.981 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.981 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.981 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.982 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.982 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.982 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.982 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.982 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.982 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.983 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.983 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.983 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.983 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.983 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.983 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.983 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.984 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.984 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.984 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.984 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.984 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.984 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.984 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.985 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.985 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.985 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.985 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.985 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.985 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.985 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.986 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.986 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.986 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.986 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.986 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.986 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.987 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.987 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.987 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.987 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.987 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.987 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.988 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.988 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.988 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.988 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.988 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.988 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.989 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.989 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.989 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.989 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.990 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.990 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.990 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.990 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.990 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.991 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.991 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.991 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.991 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.991 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.992 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.992 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.992 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.992 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.992 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.992 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.992 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.993 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.993 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.993 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.993 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.993 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.993 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.994 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.994 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.994 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.994 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.994 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.994 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.994 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.995 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.995 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.995 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.995 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.995 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.996 186061 WARNING oslo_config.cfg [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 10 10:15:38 compute-0 nova_compute[186057]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 10 10:15:38 compute-0 nova_compute[186057]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 10 10:15:38 compute-0 nova_compute[186057]: and ``live_migration_inbound_addr`` respectively.
Dec 10 10:15:38 compute-0 nova_compute[186057]: ).  Its value may be silently ignored in the future.
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.996 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.996 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.996 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.996 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.996 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.997 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.997 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.997 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.997 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.997 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.997 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.998 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.998 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.998 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.998 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.998 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.998 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.999 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.999 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.999 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.999 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:38 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.999 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:38.999 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.000 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.000 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.000 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.000 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.000 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.000 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.001 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.001 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.001 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.001 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.001 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.001 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.001 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.002 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.002 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.002 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.002 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.002 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.002 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.002 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.003 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.003 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.003 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.003 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.003 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.003 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.003 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.004 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.004 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.004 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.004 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.004 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.004 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.004 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.005 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.005 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.005 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.005 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.005 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.005 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.005 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.006 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.006 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.006 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.006 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.006 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.006 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.006 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.007 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.007 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.007 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.007 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.007 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.007 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.007 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.008 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.008 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.008 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.008 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.008 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.008 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.008 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.009 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.009 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.009 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.009 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.009 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.009 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.010 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.010 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.010 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.010 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.010 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.010 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.011 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.011 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.011 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.011 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.011 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.011 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.012 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.012 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.012 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.012 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.012 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.012 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.012 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.013 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.013 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.013 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.013 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.013 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.013 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.013 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.014 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.014 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.014 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.014 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.014 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.014 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.015 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.015 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.015 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.015 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.015 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.016 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.016 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.016 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.016 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.016 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.016 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.017 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.017 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.017 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.017 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.017 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.017 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.018 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.018 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.018 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.018 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.018 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.018 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.018 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.019 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.019 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.019 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.019 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.019 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.019 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.019 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.020 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.020 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.020 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.020 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.020 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.020 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.021 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.021 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.021 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.021 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.021 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.021 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.022 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.022 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.022 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.022 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.022 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.022 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.022 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.023 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.023 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.023 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.023 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.023 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.023 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.024 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.024 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.024 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.024 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.024 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.024 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.024 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.025 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.025 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.025 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.025 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.025 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.025 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.025 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.026 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.026 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.026 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.026 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.026 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.026 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.027 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.027 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.027 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.027 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.027 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.028 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.028 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.028 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.028 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.028 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.028 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.028 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.029 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.029 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.029 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.029 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.029 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.029 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.029 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.030 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.030 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.030 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.030 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.030 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.030 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.030 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.031 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.031 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.031 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.031 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.031 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.031 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.031 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.032 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.032 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.032 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.032 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.032 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.032 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.032 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.033 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.033 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.033 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.033 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.034 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.034 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.034 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.034 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.034 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.034 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.035 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.035 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.035 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.035 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.035 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.035 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.036 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.036 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.036 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.036 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.036 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.036 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.036 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.037 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.037 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.037 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.037 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.037 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.037 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.037 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.038 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.038 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.038 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.038 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.038 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.038 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.038 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.039 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.039 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.039 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.039 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.039 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.039 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.039 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.040 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.040 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.040 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.040 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.040 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.040 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.040 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.041 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.041 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.041 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.041 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.041 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.041 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.041 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.042 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.042 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.042 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.042 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.042 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.042 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.042 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.043 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.043 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.043 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.043 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.043 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.043 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.043 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.044 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.044 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.044 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.044 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.044 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.044 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.044 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.045 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.045 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.045 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.045 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.045 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.045 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.045 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.046 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.046 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.046 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.046 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.046 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.046 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.046 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.046 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.047 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.047 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.047 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.047 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.047 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.047 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.048 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.048 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.048 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.048 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.048 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.048 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.048 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.049 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.049 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.049 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.049 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.049 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.049 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.049 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.050 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.050 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.050 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.050 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.050 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.050 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.050 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.051 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.051 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.051 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.051 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.051 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.051 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.051 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.051 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.052 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.052 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.052 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.052 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.052 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.052 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.052 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.053 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.053 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.053 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.053 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.053 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.053 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.054 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.054 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.054 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.054 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.054 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.054 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.054 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.055 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.055 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.055 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.055 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.055 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.055 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.055 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.055 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.056 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.056 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.056 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.056 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.056 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.056 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.056 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.057 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.057 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.057 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.057 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.057 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.057 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.057 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.058 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.058 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.058 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.058 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.058 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.058 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.058 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.059 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.059 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.059 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.059 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.059 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.059 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.059 186061 DEBUG oslo_service.service [None req-956828d9-631d-4b28-8dcf-2f93ae09907b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.060 186061 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.076 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.076 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.077 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.077 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 10 10:15:39 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 10 10:15:39 compute-0 sudo[186701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaztiswuwemcobeljripfhlajnaiasvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361738.5486054-1529-223457331949994/AnsiballZ_podman_container.py'
Dec 10 10:15:39 compute-0 sudo[186701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:39 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.155 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fe8cad06ac0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.159 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fe8cad06ac0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.160 186061 INFO nova.virt.libvirt.driver [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Connection event '1' reason 'None'
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.180 186061 WARNING nova.virt.libvirt.driver [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 10 10:15:39 compute-0 nova_compute[186057]: 2025-12-10 10:15:39.180 186061 DEBUG nova.virt.libvirt.volume.mount [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 10 10:15:39 compute-0 python3.9[186715]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 10 10:15:39 compute-0 sudo[186701]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:39 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 10 10:15:39 compute-0 sudo[186926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpqrlpiyunsregrskufyyncwnpclmxud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361739.6538205-1537-233994630889765/AnsiballZ_systemd.py'
Dec 10 10:15:39 compute-0 sudo[186926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.050 186061 INFO nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Libvirt host capabilities <capabilities>
Dec 10 10:15:40 compute-0 nova_compute[186057]: 
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <host>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <uuid>4f9a932e-d23a-4638-b69b-16fdca20f7f4</uuid>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <cpu>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <arch>x86_64</arch>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model>EPYC-Rome-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <vendor>AMD</vendor>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <microcode version='16777317'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <signature family='23' model='49' stepping='0'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='x2apic'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='tsc-deadline'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='osxsave'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='hypervisor'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='tsc_adjust'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='spec-ctrl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='stibp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='arch-capabilities'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='cmp_legacy'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='topoext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='virt-ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='lbrv'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='tsc-scale'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='vmcb-clean'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='pause-filter'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='pfthreshold'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='svme-addr-chk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='rdctl-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='skip-l1dfl-vmentry'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='mds-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature name='pschange-mc-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <pages unit='KiB' size='4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <pages unit='KiB' size='2048'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <pages unit='KiB' size='1048576'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </cpu>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <power_management>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <suspend_mem/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <suspend_disk/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <suspend_hybrid/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </power_management>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <iommu support='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <migration_features>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <live/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <uri_transports>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <uri_transport>tcp</uri_transport>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <uri_transport>rdma</uri_transport>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </uri_transports>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </migration_features>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <topology>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <cells num='1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <cell id='0'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:           <memory unit='KiB'>7864300</memory>
Dec 10 10:15:40 compute-0 nova_compute[186057]:           <pages unit='KiB' size='4'>1966075</pages>
Dec 10 10:15:40 compute-0 nova_compute[186057]:           <pages unit='KiB' size='2048'>0</pages>
Dec 10 10:15:40 compute-0 nova_compute[186057]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 10 10:15:40 compute-0 nova_compute[186057]:           <distances>
Dec 10 10:15:40 compute-0 nova_compute[186057]:             <sibling id='0' value='10'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:           </distances>
Dec 10 10:15:40 compute-0 nova_compute[186057]:           <cpus num='8'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:           </cpus>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         </cell>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </cells>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </topology>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <cache>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </cache>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <secmodel>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model>selinux</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <doi>0</doi>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </secmodel>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <secmodel>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model>dac</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <doi>0</doi>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </secmodel>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </host>
Dec 10 10:15:40 compute-0 nova_compute[186057]: 
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <guest>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <os_type>hvm</os_type>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <arch name='i686'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <wordsize>32</wordsize>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <domain type='qemu'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <domain type='kvm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </arch>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <features>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <pae/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <nonpae/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <acpi default='on' toggle='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <apic default='on' toggle='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <cpuselection/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <deviceboot/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <disksnapshot default='on' toggle='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <externalSnapshot/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </features>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </guest>
Dec 10 10:15:40 compute-0 nova_compute[186057]: 
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <guest>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <os_type>hvm</os_type>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <arch name='x86_64'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <wordsize>64</wordsize>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <domain type='qemu'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <domain type='kvm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </arch>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <features>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <acpi default='on' toggle='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <apic default='on' toggle='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <cpuselection/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <deviceboot/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <disksnapshot default='on' toggle='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <externalSnapshot/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </features>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </guest>
Dec 10 10:15:40 compute-0 nova_compute[186057]: 
Dec 10 10:15:40 compute-0 nova_compute[186057]: </capabilities>
Dec 10 10:15:40 compute-0 nova_compute[186057]: 
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.057 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.076 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 10 10:15:40 compute-0 nova_compute[186057]: <domainCapabilities>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <path>/usr/libexec/qemu-kvm</path>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <domain>kvm</domain>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <arch>i686</arch>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <vcpu max='240'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <iothreads supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <os supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <enum name='firmware'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <loader supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>rom</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pflash</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='readonly'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>yes</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>no</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='secure'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>no</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </loader>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </os>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <cpu>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='host-passthrough' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='hostPassthroughMigratable'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>on</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>off</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='maximum' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='maximumMigratable'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>on</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>off</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='host-model' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <vendor>AMD</vendor>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='x2apic'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='tsc-deadline'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='hypervisor'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='tsc_adjust'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='spec-ctrl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='stibp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='cmp_legacy'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='overflow-recov'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='succor'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='ibrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='amd-ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='virt-ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='lbrv'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='tsc-scale'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='vmcb-clean'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='flushbyasid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='pause-filter'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='pfthreshold'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='svme-addr-chk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='disable' name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='custom' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cooperlake'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cooperlake-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cooperlake-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Dhyana-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Genoa'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amd-psfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='auto-ibrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='stibp-always-on'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Genoa-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amd-psfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='auto-ibrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='stibp-always-on'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Milan'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Milan-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Milan-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amd-psfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='stibp-always-on'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='GraniteRapids'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='prefetchiti'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='GraniteRapids-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='prefetchiti'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='GraniteRapids-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10-128'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10-256'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10-512'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='prefetchiti'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v6'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v7'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='KnightsMill'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512er'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512pf'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='KnightsMill-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512er'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512pf'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G4-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tbm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G5-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tbm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SierraForest'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cmpccxadd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SierraForest-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cmpccxadd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='athlon'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='athlon-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='core2duo'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='core2duo-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='coreduo'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='coreduo-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='n270'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='n270-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='phenom'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='phenom-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </cpu>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <memoryBacking supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <enum name='sourceType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>file</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>anonymous</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>memfd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </memoryBacking>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <devices>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <disk supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='diskDevice'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>disk</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>cdrom</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>floppy</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>lun</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='bus'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>ide</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>fdc</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>scsi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>usb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>sata</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-non-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </disk>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <graphics supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vnc</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>egl-headless</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>dbus</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </graphics>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <video supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='modelType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vga</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>cirrus</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>none</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>bochs</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>ramfb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </video>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <hostdev supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='mode'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>subsystem</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='startupPolicy'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>default</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>mandatory</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>requisite</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>optional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='subsysType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>usb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pci</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>scsi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='capsType'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='pciBackend'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </hostdev>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <rng supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-non-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendModel'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>random</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>egd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>builtin</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </rng>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <filesystem supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='driverType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>path</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>handle</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtiofs</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </filesystem>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <tpm supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tpm-tis</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tpm-crb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendModel'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>emulator</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>external</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendVersion'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>2.0</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </tpm>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <redirdev supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='bus'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>usb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </redirdev>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <channel supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pty</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>unix</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </channel>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <crypto supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>qemu</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendModel'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>builtin</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </crypto>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <interface supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>default</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>passt</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </interface>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <panic supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>isa</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>hyperv</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </panic>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <console supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>null</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vc</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pty</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>dev</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>file</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pipe</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>stdio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>udp</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tcp</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>unix</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>qemu-vdagent</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>dbus</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </console>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </devices>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <features>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <gic supported='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <vmcoreinfo supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <genid supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <backingStoreInput supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <backup supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <async-teardown supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <ps2 supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <sev supported='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <sgx supported='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <hyperv supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='features'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>relaxed</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vapic</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>spinlocks</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vpindex</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>runtime</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>synic</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>stimer</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>reset</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vendor_id</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>frequencies</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>reenlightenment</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tlbflush</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>ipi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>avic</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>emsr_bitmap</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>xmm_input</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <defaults>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <spinlocks>4095</spinlocks>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <stimer_direct>on</stimer_direct>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <tlbflush_direct>on</tlbflush_direct>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <tlbflush_extended>on</tlbflush_extended>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </defaults>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </hyperv>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <launchSecurity supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='sectype'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tdx</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </launchSecurity>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </features>
Dec 10 10:15:40 compute-0 nova_compute[186057]: </domainCapabilities>
Dec 10 10:15:40 compute-0 nova_compute[186057]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.081 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 10 10:15:40 compute-0 nova_compute[186057]: <domainCapabilities>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <path>/usr/libexec/qemu-kvm</path>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <domain>kvm</domain>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <arch>i686</arch>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <vcpu max='4096'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <iothreads supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <os supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <enum name='firmware'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <loader supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>rom</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pflash</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='readonly'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>yes</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>no</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='secure'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>no</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </loader>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </os>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <cpu>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='host-passthrough' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='hostPassthroughMigratable'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>on</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>off</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='maximum' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='maximumMigratable'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>on</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>off</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='host-model' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <vendor>AMD</vendor>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='x2apic'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='tsc-deadline'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='hypervisor'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='tsc_adjust'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='spec-ctrl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='stibp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='cmp_legacy'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='overflow-recov'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='succor'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='ibrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='amd-ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='virt-ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='lbrv'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='tsc-scale'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='vmcb-clean'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='flushbyasid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='pause-filter'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='pfthreshold'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='svme-addr-chk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='disable' name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='custom' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cooperlake'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cooperlake-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cooperlake-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Dhyana-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Genoa'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amd-psfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='auto-ibrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='stibp-always-on'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Genoa-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amd-psfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='auto-ibrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='stibp-always-on'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Milan'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Milan-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Milan-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amd-psfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='stibp-always-on'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='GraniteRapids'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='prefetchiti'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='GraniteRapids-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='prefetchiti'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='GraniteRapids-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10-128'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10-256'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10-512'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='prefetchiti'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v6'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v7'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='KnightsMill'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512er'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512pf'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='KnightsMill-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512er'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512pf'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G4-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tbm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G5-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tbm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SierraForest'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cmpccxadd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SierraForest-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cmpccxadd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='athlon'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='athlon-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='core2duo'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='core2duo-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='coreduo'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='coreduo-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='n270'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='n270-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='phenom'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='phenom-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </cpu>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <memoryBacking supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <enum name='sourceType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>file</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>anonymous</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>memfd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </memoryBacking>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <devices>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <disk supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='diskDevice'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>disk</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>cdrom</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>floppy</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>lun</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='bus'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>fdc</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>scsi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>usb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>sata</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-non-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </disk>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <graphics supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vnc</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>egl-headless</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>dbus</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </graphics>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <video supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='modelType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vga</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>cirrus</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>none</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>bochs</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>ramfb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </video>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <hostdev supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='mode'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>subsystem</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='startupPolicy'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>default</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>mandatory</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>requisite</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>optional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='subsysType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>usb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pci</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>scsi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='capsType'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='pciBackend'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </hostdev>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <rng supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-non-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendModel'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>random</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>egd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>builtin</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </rng>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <filesystem supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='driverType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>path</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>handle</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtiofs</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </filesystem>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <tpm supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tpm-tis</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tpm-crb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendModel'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>emulator</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>external</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendVersion'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>2.0</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </tpm>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <redirdev supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='bus'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>usb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </redirdev>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <channel supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pty</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>unix</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </channel>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <crypto supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>qemu</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendModel'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>builtin</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </crypto>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <interface supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>default</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>passt</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </interface>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <panic supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>isa</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>hyperv</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </panic>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <console supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>null</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vc</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pty</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>dev</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>file</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pipe</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>stdio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>udp</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tcp</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>unix</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>qemu-vdagent</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>dbus</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </console>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </devices>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <features>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <gic supported='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <vmcoreinfo supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <genid supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <backingStoreInput supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <backup supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <async-teardown supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <ps2 supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <sev supported='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <sgx supported='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <hyperv supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='features'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>relaxed</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vapic</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>spinlocks</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vpindex</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>runtime</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>synic</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>stimer</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>reset</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vendor_id</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>frequencies</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>reenlightenment</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tlbflush</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>ipi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>avic</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>emsr_bitmap</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>xmm_input</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <defaults>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <spinlocks>4095</spinlocks>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <stimer_direct>on</stimer_direct>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <tlbflush_direct>on</tlbflush_direct>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <tlbflush_extended>on</tlbflush_extended>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </defaults>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </hyperv>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <launchSecurity supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='sectype'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tdx</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </launchSecurity>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </features>
Dec 10 10:15:40 compute-0 nova_compute[186057]: </domainCapabilities>
Dec 10 10:15:40 compute-0 nova_compute[186057]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.112 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.116 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 10 10:15:40 compute-0 nova_compute[186057]: <domainCapabilities>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <path>/usr/libexec/qemu-kvm</path>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <domain>kvm</domain>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <arch>x86_64</arch>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <vcpu max='240'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <iothreads supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <os supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <enum name='firmware'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <loader supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>rom</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pflash</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='readonly'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>yes</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>no</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='secure'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>no</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </loader>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </os>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <cpu>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='host-passthrough' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='hostPassthroughMigratable'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>on</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>off</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='maximum' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='maximumMigratable'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>on</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>off</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='host-model' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <vendor>AMD</vendor>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='x2apic'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='tsc-deadline'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='hypervisor'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='tsc_adjust'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='spec-ctrl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='stibp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='cmp_legacy'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='overflow-recov'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='succor'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='ibrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='amd-ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='virt-ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='lbrv'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='tsc-scale'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='vmcb-clean'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='flushbyasid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='pause-filter'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='pfthreshold'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='svme-addr-chk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='disable' name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='custom' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cooperlake'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cooperlake-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cooperlake-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Dhyana-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Genoa'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amd-psfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='auto-ibrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='stibp-always-on'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Genoa-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amd-psfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='auto-ibrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='stibp-always-on'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Milan'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Milan-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Milan-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amd-psfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='stibp-always-on'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='GraniteRapids'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='prefetchiti'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='GraniteRapids-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='prefetchiti'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='GraniteRapids-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10-128'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10-256'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10-512'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='prefetchiti'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v6'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v7'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='KnightsMill'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512er'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512pf'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='KnightsMill-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512er'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512pf'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G4-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tbm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G5-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tbm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SierraForest'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cmpccxadd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SierraForest-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cmpccxadd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='athlon'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='athlon-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='core2duo'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='core2duo-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='coreduo'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='coreduo-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='n270'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='n270-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='phenom'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='phenom-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </cpu>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <memoryBacking supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <enum name='sourceType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>file</value>
Dec 10 10:15:40 compute-0 python3.9[186928]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>anonymous</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>memfd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </memoryBacking>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <devices>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <disk supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='diskDevice'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>disk</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>cdrom</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>floppy</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>lun</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='bus'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>ide</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>fdc</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>scsi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>usb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>sata</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-non-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </disk>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <graphics supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vnc</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>egl-headless</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>dbus</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </graphics>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <video supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='modelType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vga</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>cirrus</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>none</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>bochs</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>ramfb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </video>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <hostdev supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='mode'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>subsystem</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='startupPolicy'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>default</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>mandatory</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>requisite</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>optional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='subsysType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>usb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pci</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>scsi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='capsType'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='pciBackend'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </hostdev>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <rng supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-non-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendModel'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>random</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>egd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>builtin</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </rng>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <filesystem supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='driverType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>path</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>handle</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtiofs</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </filesystem>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <tpm supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tpm-tis</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tpm-crb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendModel'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>emulator</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>external</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendVersion'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>2.0</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </tpm>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <redirdev supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='bus'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>usb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </redirdev>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <channel supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pty</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>unix</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </channel>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <crypto supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>qemu</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendModel'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>builtin</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </crypto>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <interface supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>default</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>passt</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </interface>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <panic supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>isa</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>hyperv</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </panic>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <console supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>null</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vc</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pty</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>dev</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>file</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pipe</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>stdio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>udp</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tcp</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>unix</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>qemu-vdagent</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>dbus</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </console>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </devices>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <features>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <gic supported='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <vmcoreinfo supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <genid supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <backingStoreInput supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <backup supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <async-teardown supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <ps2 supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <sev supported='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <sgx supported='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <hyperv supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='features'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>relaxed</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vapic</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>spinlocks</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vpindex</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>runtime</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>synic</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>stimer</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>reset</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vendor_id</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>frequencies</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>reenlightenment</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tlbflush</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>ipi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>avic</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>emsr_bitmap</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>xmm_input</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <defaults>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <spinlocks>4095</spinlocks>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <stimer_direct>on</stimer_direct>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <tlbflush_direct>on</tlbflush_direct>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <tlbflush_extended>on</tlbflush_extended>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </defaults>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </hyperv>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <launchSecurity supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='sectype'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tdx</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </launchSecurity>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </features>
Dec 10 10:15:40 compute-0 nova_compute[186057]: </domainCapabilities>
Dec 10 10:15:40 compute-0 nova_compute[186057]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.178 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 10 10:15:40 compute-0 nova_compute[186057]: <domainCapabilities>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <path>/usr/libexec/qemu-kvm</path>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <domain>kvm</domain>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <arch>x86_64</arch>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <vcpu max='4096'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <iothreads supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <os supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <enum name='firmware'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>efi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <loader supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>rom</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pflash</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='readonly'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>yes</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>no</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='secure'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>yes</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>no</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </loader>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </os>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <cpu>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='host-passthrough' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='hostPassthroughMigratable'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>on</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>off</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='maximum' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='maximumMigratable'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>on</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>off</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='host-model' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <vendor>AMD</vendor>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='x2apic'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='tsc-deadline'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='hypervisor'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='tsc_adjust'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='spec-ctrl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='stibp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='cmp_legacy'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='overflow-recov'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='succor'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='ibrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='amd-ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='virt-ssbd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='lbrv'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='tsc-scale'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='vmcb-clean'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='flushbyasid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='pause-filter'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='pfthreshold'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='svme-addr-chk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <feature policy='disable' name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <mode name='custom' supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Broadwell-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cascadelake-Server-v5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cooperlake'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cooperlake-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Cooperlake-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Denverton-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Dhyana-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Genoa'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amd-psfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='auto-ibrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='stibp-always-on'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Genoa-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amd-psfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='auto-ibrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='stibp-always-on'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Milan'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Milan-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Milan-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amd-psfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='stibp-always-on'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-Rome-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='EPYC-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='GraniteRapids'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='prefetchiti'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='GraniteRapids-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='prefetchiti'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='GraniteRapids-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10-128'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10-256'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx10-512'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='prefetchiti'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Haswell-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-noTSX'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v6'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Icelake-Server-v7'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 systemd[1]: Stopping nova_compute container...
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='IvyBridge-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='KnightsMill'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512er'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512pf'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='KnightsMill-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512er'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512pf'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G4-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tbm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Opteron_G5-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fma4'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tbm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xop'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SapphireRapids-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='amx-tile'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-bf16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-fp16'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bitalg'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrc'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fzrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='la57'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='taa-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xfd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SierraForest'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cmpccxadd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='SierraForest-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ifma'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cmpccxadd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fbsdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='fsrs'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ibrs-all'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mcdt-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pbrsb-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='psdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='serialize'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vaes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Client-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='hle'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='rtm'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Skylake-Server-v5'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512bw'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512cd'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512dq'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512f'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='avx512vl'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='invpcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pcid'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='pku'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='mpx'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v2'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v3'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='core-capability'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='split-lock-detect'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='Snowridge-v4'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='cldemote'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='erms'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='gfni'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdir64b'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='movdiri'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='xsaves'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='athlon'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='athlon-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='core2duo'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='core2duo-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='coreduo'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='coreduo-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='n270'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='n270-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='ss'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='phenom'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <blockers model='phenom-v1'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnow'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <feature name='3dnowext'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </blockers>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </mode>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </cpu>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <memoryBacking supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <enum name='sourceType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>file</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>anonymous</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <value>memfd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </memoryBacking>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <devices>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <disk supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='diskDevice'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>disk</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>cdrom</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>floppy</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>lun</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='bus'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>fdc</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>scsi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>usb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>sata</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-non-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </disk>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <graphics supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vnc</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>egl-headless</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>dbus</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </graphics>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <video supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='modelType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vga</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>cirrus</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>none</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>bochs</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>ramfb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </video>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <hostdev supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='mode'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>subsystem</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='startupPolicy'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>default</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>mandatory</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>requisite</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>optional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='subsysType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>usb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pci</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>scsi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='capsType'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='pciBackend'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </hostdev>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <rng supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtio-non-transitional</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendModel'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>random</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>egd</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>builtin</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </rng>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <filesystem supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='driverType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>path</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>handle</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>virtiofs</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </filesystem>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <tpm supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tpm-tis</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tpm-crb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendModel'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>emulator</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>external</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendVersion'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>2.0</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </tpm>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <redirdev supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='bus'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>usb</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </redirdev>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <channel supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pty</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>unix</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </channel>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <crypto supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>qemu</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendModel'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>builtin</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </crypto>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <interface supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='backendType'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>default</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>passt</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </interface>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <panic supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='model'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>isa</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>hyperv</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </panic>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <console supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='type'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>null</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vc</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pty</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>dev</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>file</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>pipe</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>stdio</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>udp</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tcp</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>unix</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>qemu-vdagent</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>dbus</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </console>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </devices>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   <features>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <gic supported='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <vmcoreinfo supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <genid supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <backingStoreInput supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <backup supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <async-teardown supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <ps2 supported='yes'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <sev supported='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <sgx supported='no'/>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <hyperv supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='features'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>relaxed</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vapic</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>spinlocks</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vpindex</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>runtime</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>synic</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>stimer</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>reset</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>vendor_id</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>frequencies</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>reenlightenment</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tlbflush</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>ipi</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>avic</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>emsr_bitmap</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>xmm_input</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <defaults>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <spinlocks>4095</spinlocks>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <stimer_direct>on</stimer_direct>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <tlbflush_direct>on</tlbflush_direct>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <tlbflush_extended>on</tlbflush_extended>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </defaults>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </hyperv>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     <launchSecurity supported='yes'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       <enum name='sectype'>
Dec 10 10:15:40 compute-0 nova_compute[186057]:         <value>tdx</value>
Dec 10 10:15:40 compute-0 nova_compute[186057]:       </enum>
Dec 10 10:15:40 compute-0 nova_compute[186057]:     </launchSecurity>
Dec 10 10:15:40 compute-0 nova_compute[186057]:   </features>
Dec 10 10:15:40 compute-0 nova_compute[186057]: </domainCapabilities>
Dec 10 10:15:40 compute-0 nova_compute[186057]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.236 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.236 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.236 186061 DEBUG nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.236 186061 INFO nova.virt.libvirt.host [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Secure Boot support detected
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.239 186061 INFO nova.virt.libvirt.driver [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.240 186061 INFO nova.virt.libvirt.driver [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 10 10:15:40 compute-0 nova_compute[186057]: 2025-12-10 10:15:40.249 186061 DEBUG nova.virt.libvirt.driver [None req-f477f112-de9f-4b26-a946-e7c43819b9a5 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 10 10:15:40 compute-0 virtqemud[186713]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 10 10:15:40 compute-0 virtqemud[186713]: hostname: compute-0
Dec 10 10:15:40 compute-0 virtqemud[186713]: End of file while reading data: Input/output error
Dec 10 10:15:40 compute-0 systemd[1]: libpod-b084993083eae38031b275ffab605ddaafe09a4bf5c336bdf678bbef18bb87c4.scope: Deactivated successfully.
Dec 10 10:15:40 compute-0 systemd[1]: libpod-b084993083eae38031b275ffab605ddaafe09a4bf5c336bdf678bbef18bb87c4.scope: Consumed 2.717s CPU time.
Dec 10 10:15:40 compute-0 podman[186936]: 2025-12-10 10:15:40.382920735 +0000 UTC m=+0.081220923 container died b084993083eae38031b275ffab605ddaafe09a4bf5c336bdf678bbef18bb87c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 10 10:15:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b084993083eae38031b275ffab605ddaafe09a4bf5c336bdf678bbef18bb87c4-userdata-shm.mount: Deactivated successfully.
Dec 10 10:15:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-62555d6373f0bc57f99d72dc5d8e0c233dabd497c23d73123f5600004fe34c5b-merged.mount: Deactivated successfully.
Dec 10 10:15:40 compute-0 podman[186936]: 2025-12-10 10:15:40.44006155 +0000 UTC m=+0.138361738 container cleanup b084993083eae38031b275ffab605ddaafe09a4bf5c336bdf678bbef18bb87c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Dec 10 10:15:40 compute-0 podman[186936]: nova_compute
Dec 10 10:15:40 compute-0 podman[186962]: nova_compute
Dec 10 10:15:40 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 10 10:15:40 compute-0 systemd[1]: Stopped nova_compute container.
Dec 10 10:15:40 compute-0 systemd[1]: Starting nova_compute container...
Dec 10 10:15:40 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:15:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62555d6373f0bc57f99d72dc5d8e0c233dabd497c23d73123f5600004fe34c5b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62555d6373f0bc57f99d72dc5d8e0c233dabd497c23d73123f5600004fe34c5b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62555d6373f0bc57f99d72dc5d8e0c233dabd497c23d73123f5600004fe34c5b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62555d6373f0bc57f99d72dc5d8e0c233dabd497c23d73123f5600004fe34c5b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62555d6373f0bc57f99d72dc5d8e0c233dabd497c23d73123f5600004fe34c5b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:40 compute-0 podman[186976]: 2025-12-10 10:15:40.669005681 +0000 UTC m=+0.139021316 container init b084993083eae38031b275ffab605ddaafe09a4bf5c336bdf678bbef18bb87c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec 10 10:15:40 compute-0 podman[186976]: 2025-12-10 10:15:40.675375813 +0000 UTC m=+0.145391418 container start b084993083eae38031b275ffab605ddaafe09a4bf5c336bdf678bbef18bb87c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 10 10:15:40 compute-0 podman[186976]: nova_compute
Dec 10 10:15:40 compute-0 nova_compute[186989]: + sudo -E kolla_set_configs
Dec 10 10:15:40 compute-0 systemd[1]: Started nova_compute container.
Dec 10 10:15:40 compute-0 sudo[186926]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Validating config file
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Copying service configuration files
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Deleting /etc/ceph
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Creating directory /etc/ceph
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Setting permission for /etc/ceph
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Writing out command to execute
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 10 10:15:40 compute-0 nova_compute[186989]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 10 10:15:40 compute-0 nova_compute[186989]: ++ cat /run_command
Dec 10 10:15:40 compute-0 nova_compute[186989]: + CMD=nova-compute
Dec 10 10:15:40 compute-0 nova_compute[186989]: + ARGS=
Dec 10 10:15:40 compute-0 nova_compute[186989]: + sudo kolla_copy_cacerts
Dec 10 10:15:40 compute-0 nova_compute[186989]: + [[ ! -n '' ]]
Dec 10 10:15:40 compute-0 nova_compute[186989]: + . kolla_extend_start
Dec 10 10:15:40 compute-0 nova_compute[186989]: Running command: 'nova-compute'
Dec 10 10:15:40 compute-0 nova_compute[186989]: + echo 'Running command: '\''nova-compute'\'''
Dec 10 10:15:40 compute-0 nova_compute[186989]: + umask 0022
Dec 10 10:15:40 compute-0 nova_compute[186989]: + exec nova-compute
Dec 10 10:15:41 compute-0 sudo[187150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crskpxnyjjtzsdywpwcfbaoxhuonwzya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361740.925494-1546-116269384485667/AnsiballZ_podman_container.py'
Dec 10 10:15:41 compute-0 sudo[187150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:41 compute-0 python3.9[187152]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 10 10:15:41 compute-0 systemd[1]: Started libpod-conmon-f476b6a15ae20982cd7462bdd9b223673c4cfbfd6c37c5e00d20a34b79b70cd7.scope.
Dec 10 10:15:41 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:15:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d3d2c5129783595e3b1f4846e4a7b147fd27fa9173de8aac785414887f13e7b/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d3d2c5129783595e3b1f4846e4a7b147fd27fa9173de8aac785414887f13e7b/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d3d2c5129783595e3b1f4846e4a7b147fd27fa9173de8aac785414887f13e7b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 10 10:15:41 compute-0 podman[187178]: 2025-12-10 10:15:41.71032273 +0000 UTC m=+0.138493912 container init f476b6a15ae20982cd7462bdd9b223673c4cfbfd6c37c5e00d20a34b79b70cd7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3)
Dec 10 10:15:41 compute-0 podman[187178]: 2025-12-10 10:15:41.718494129 +0000 UTC m=+0.146665291 container start f476b6a15ae20982cd7462bdd9b223673c4cfbfd6c37c5e00d20a34b79b70cd7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=nova_compute_init, io.buildah.version=1.41.3)
Dec 10 10:15:41 compute-0 python3.9[187152]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Applying nova statedir ownership
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 10 10:15:41 compute-0 nova_compute_init[187199]: INFO:nova_statedir:Nova statedir ownership complete
Dec 10 10:15:41 compute-0 systemd[1]: libpod-f476b6a15ae20982cd7462bdd9b223673c4cfbfd6c37c5e00d20a34b79b70cd7.scope: Deactivated successfully.
Dec 10 10:15:41 compute-0 podman[187200]: 2025-12-10 10:15:41.778469801 +0000 UTC m=+0.030884862 container died f476b6a15ae20982cd7462bdd9b223673c4cfbfd6c37c5e00d20a34b79b70cd7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 10 10:15:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f476b6a15ae20982cd7462bdd9b223673c4cfbfd6c37c5e00d20a34b79b70cd7-userdata-shm.mount: Deactivated successfully.
Dec 10 10:15:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-7d3d2c5129783595e3b1f4846e4a7b147fd27fa9173de8aac785414887f13e7b-merged.mount: Deactivated successfully.
Dec 10 10:15:41 compute-0 podman[187211]: 2025-12-10 10:15:41.853800104 +0000 UTC m=+0.067337220 container cleanup f476b6a15ae20982cd7462bdd9b223673c4cfbfd6c37c5e00d20a34b79b70cd7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 10 10:15:41 compute-0 sudo[187150]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:41 compute-0 systemd[1]: libpod-conmon-f476b6a15ae20982cd7462bdd9b223673c4cfbfd6c37c5e00d20a34b79b70cd7.scope: Deactivated successfully.
Dec 10 10:15:42 compute-0 sshd-session[158920]: Connection closed by 192.168.122.30 port 51168
Dec 10 10:15:42 compute-0 sshd-session[158917]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:15:42 compute-0 systemd-logind[787]: Session 24 logged out. Waiting for processes to exit.
Dec 10 10:15:42 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Dec 10 10:15:42 compute-0 systemd[1]: session-24.scope: Consumed 1min 58.029s CPU time.
Dec 10 10:15:42 compute-0 systemd-logind[787]: Removed session 24.
Dec 10 10:15:42 compute-0 podman[187263]: 2025-12-10 10:15:42.464118182 +0000 UTC m=+0.096071471 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 10 10:15:42 compute-0 nova_compute[186989]: 2025-12-10 10:15:42.753 186993 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 10 10:15:42 compute-0 nova_compute[186989]: 2025-12-10 10:15:42.753 186993 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 10 10:15:42 compute-0 nova_compute[186989]: 2025-12-10 10:15:42.753 186993 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 10 10:15:42 compute-0 nova_compute[186989]: 2025-12-10 10:15:42.754 186993 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 10 10:15:42 compute-0 nova_compute[186989]: 2025-12-10 10:15:42.888 186993 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:15:42 compute-0 nova_compute[186989]: 2025-12-10 10:15:42.911 186993 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:15:42 compute-0 nova_compute[186989]: 2025-12-10 10:15:42.912 186993 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 10 10:15:43 compute-0 podman[187294]: 2025-12-10 10:15:43.083661429 +0000 UTC m=+0.114452987 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.419 186993 INFO nova.virt.driver [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.515 186993 INFO nova.compute.provider_config [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.527 186993 DEBUG oslo_concurrency.lockutils [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.528 186993 DEBUG oslo_concurrency.lockutils [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.528 186993 DEBUG oslo_concurrency.lockutils [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.528 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.528 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.529 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.529 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.529 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.529 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.529 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.530 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.530 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.530 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.530 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.530 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.531 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.531 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.531 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.531 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.531 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.532 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.532 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.532 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.532 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.532 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.533 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.533 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.533 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.533 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.533 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.534 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.534 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.534 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.534 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.534 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.535 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.535 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.535 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.535 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.535 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.536 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.536 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.536 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.536 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.536 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.537 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.537 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.537 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.537 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.537 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.538 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.538 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.538 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.538 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.538 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.539 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.539 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.539 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.539 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.539 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.540 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.540 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.540 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.540 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.540 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.541 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.541 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.541 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.541 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.541 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.541 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.542 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.542 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.542 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.542 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.542 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.543 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.543 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.543 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.543 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.543 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.544 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.544 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.544 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.544 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.544 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.545 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.545 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.545 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.545 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.545 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.546 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.546 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.546 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.546 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.546 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.547 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.547 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.547 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.547 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.547 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.547 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.548 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.548 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.548 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.548 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.548 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.549 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.549 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.549 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.549 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.549 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.550 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.550 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.550 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.550 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.550 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.551 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.551 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.551 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.551 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.551 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.552 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.552 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.552 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.552 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.552 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.552 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.553 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.553 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.553 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.553 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.553 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.554 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.554 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.554 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.554 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.554 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.555 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.555 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.555 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.555 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.555 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.555 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.556 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.556 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.556 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.556 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.556 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.557 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.557 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.557 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.557 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.558 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.558 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.558 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.558 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.558 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.559 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.559 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.559 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.559 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.559 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.560 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.560 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.560 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.560 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.560 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.560 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.561 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.561 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.561 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.561 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.561 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.562 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.562 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.562 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.562 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.562 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.563 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.563 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.563 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.563 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.564 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.564 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.564 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.564 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.564 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.565 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.565 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.565 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.565 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.565 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.566 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.566 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.566 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.566 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.566 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.567 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.567 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.567 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.567 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.567 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.568 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.568 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.568 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.568 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.568 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.569 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.569 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.569 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.569 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.569 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.570 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.570 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.570 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.570 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.570 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.570 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.571 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.571 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.571 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.571 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.571 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.572 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.572 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.572 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.572 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.572 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.573 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.573 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.573 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.573 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.573 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.574 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.574 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.574 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.574 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.574 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.575 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.575 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.575 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.575 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.575 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.576 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.576 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.576 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.576 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.576 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.577 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.577 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.577 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.577 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.577 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.578 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.578 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.578 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.578 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.578 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.578 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.579 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.579 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.579 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.579 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.579 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.580 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.580 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.580 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.580 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.580 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.581 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.581 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.581 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.581 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.581 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.582 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.582 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.582 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.582 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.582 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.583 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.583 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.583 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.583 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.583 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.584 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.584 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.584 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.584 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.584 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.585 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.585 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.585 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.585 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.585 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.586 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.586 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.586 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.586 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.586 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.587 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.587 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.587 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.587 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.587 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.587 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.588 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.588 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.588 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.588 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.588 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.589 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.589 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.589 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.589 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.589 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.590 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.590 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.590 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.590 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.590 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.591 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.591 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.591 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.591 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.591 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.592 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.592 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.592 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.592 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.592 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.593 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.593 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.593 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.593 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.593 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.594 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.594 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.594 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.594 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.594 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.595 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.595 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.595 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.595 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.595 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.595 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.596 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.596 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.596 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.596 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.596 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.597 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.597 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.597 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.597 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.597 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.598 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.598 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.598 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.598 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.599 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.599 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.599 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.599 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.599 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.600 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.600 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.600 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.600 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.600 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.601 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.601 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.601 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.601 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.601 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.602 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.602 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.602 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.602 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.602 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.603 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.603 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.603 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.603 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.603 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.603 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.604 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.604 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.604 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.604 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.604 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.605 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.605 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.605 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.605 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.605 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.606 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.606 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.606 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.606 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.606 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.607 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.607 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.607 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.607 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.607 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.608 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.608 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.608 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.608 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.608 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.609 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.609 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.609 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.609 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.609 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.610 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.610 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.610 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.610 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.610 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.611 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.611 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.611 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.611 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.611 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.612 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.612 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.612 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.612 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.613 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.613 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.613 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.613 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.613 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.614 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.614 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.614 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.614 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.614 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.615 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.615 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.615 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.615 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.615 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.615 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.616 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.616 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.616 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.616 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.616 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.617 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.617 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.617 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.617 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.617 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.618 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.618 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.618 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.618 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.618 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.619 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.619 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.619 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.619 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.619 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.620 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.620 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.620 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.620 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.620 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.621 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.621 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.621 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.621 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.621 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.622 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.622 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.622 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.622 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.622 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.623 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.623 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.623 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.623 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.623 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.624 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.624 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.624 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.624 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.624 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.625 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.625 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.625 186993 WARNING oslo_config.cfg [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 10 10:15:43 compute-0 nova_compute[186989]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 10 10:15:43 compute-0 nova_compute[186989]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 10 10:15:43 compute-0 nova_compute[186989]: and ``live_migration_inbound_addr`` respectively.
Dec 10 10:15:43 compute-0 nova_compute[186989]: ).  Its value may be silently ignored in the future.
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.625 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.626 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.626 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.626 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.626 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.627 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.627 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.627 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.627 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.627 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.627 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.628 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.628 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.628 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.628 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.629 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.629 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.629 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.629 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.629 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.630 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.630 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.630 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.630 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.630 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.630 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.631 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.631 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.631 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.631 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.631 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.632 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.632 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.632 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.632 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.633 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.633 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.633 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.633 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.633 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.634 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.634 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.634 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.634 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.634 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.634 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.635 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.635 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.635 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.635 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.636 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.636 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.636 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.636 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.636 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.636 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.637 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.637 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.637 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.637 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.637 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.638 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.638 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.638 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.638 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.638 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.639 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.639 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.639 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.639 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.639 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.640 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.640 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.640 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.640 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.640 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.640 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.641 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.641 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.641 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.641 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.641 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.642 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.642 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.642 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.642 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.642 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.643 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.643 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.643 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.643 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.643 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.644 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.644 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.644 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.644 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.644 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.645 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.645 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.645 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.645 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.645 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.646 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.646 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.646 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.646 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.646 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.647 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.647 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.647 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.647 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.647 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.648 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.648 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.648 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.648 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.648 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.648 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.649 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.649 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.649 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.649 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.649 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.650 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.650 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.650 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.650 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.650 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.651 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.651 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.651 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.651 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.651 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.652 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.652 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.652 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.652 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.652 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.653 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.653 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.653 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.653 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.654 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.654 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.654 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.654 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.654 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.654 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.655 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.655 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.655 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.655 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.655 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.656 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.656 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.656 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.656 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.656 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.657 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.657 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.657 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.657 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.657 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.658 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.658 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.658 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.658 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.658 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.659 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.659 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.659 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.659 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.659 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.660 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.660 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.660 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.660 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.660 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.661 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.661 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.661 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.661 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.661 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.662 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.662 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.662 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.662 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.662 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.663 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.663 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.663 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.663 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.663 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.664 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.664 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.664 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.664 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.664 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.665 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.665 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.665 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.665 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.665 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.666 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.666 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.666 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.666 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.666 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.667 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.667 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.667 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.667 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.667 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.668 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.668 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.668 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.668 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.668 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.669 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.669 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.669 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.669 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.669 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.670 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.670 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.670 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.670 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.670 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.670 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.671 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.671 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.671 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.671 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.671 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.672 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.672 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.672 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.672 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.672 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.673 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.673 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.673 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.673 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.673 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.674 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.674 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.674 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.674 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.674 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.675 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.675 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.675 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.675 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.675 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.676 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.676 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.676 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.676 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.676 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.677 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.677 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.677 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.677 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.677 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.678 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.678 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.678 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.678 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.678 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.679 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.679 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.679 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.679 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.679 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.680 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.680 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.680 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.680 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.680 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.681 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.681 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.681 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.681 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.681 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.682 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.682 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.682 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.682 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.682 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.683 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.683 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.683 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.683 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.683 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.684 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.684 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.684 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.684 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.684 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.685 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.685 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.685 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.685 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.685 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.686 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.686 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.686 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.686 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.686 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.687 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.687 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.687 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.687 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.687 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.688 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.688 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.688 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.688 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.688 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.688 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.689 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.689 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.689 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.689 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.689 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.690 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.690 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.690 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.690 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.690 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.691 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.691 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.691 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.691 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.691 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.692 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.692 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.692 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.692 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.692 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.693 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.693 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.693 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.693 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.693 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.694 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.694 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.694 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.694 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.694 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.695 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.695 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.695 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.695 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.695 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.696 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.696 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.696 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.696 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.696 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.696 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.697 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.697 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.697 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.697 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.697 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.698 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.698 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.698 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.698 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.698 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.699 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.699 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.699 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.699 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.699 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.700 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.700 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.700 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.700 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.700 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.700 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.701 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.701 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.701 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.701 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.701 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.702 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.702 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.702 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.702 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.702 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.703 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.703 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.703 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.703 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.703 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.704 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.704 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.704 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.704 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.704 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.705 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.705 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.705 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.705 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.705 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.706 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.706 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.706 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.706 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.706 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.707 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.707 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.707 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.707 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.707 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.707 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.708 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.708 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.708 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.708 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.708 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.709 186993 DEBUG oslo_service.service [None req-2d81231c-fdf4-488e-9319-3750f57e1922 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.710 186993 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.748 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.749 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.749 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.749 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.763 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fb7a8d63d00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.765 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fb7a8d63d00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.766 186993 INFO nova.virt.libvirt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Connection event '1' reason 'None'
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.774 186993 INFO nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Libvirt host capabilities <capabilities>
Dec 10 10:15:43 compute-0 nova_compute[186989]: 
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <host>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <uuid>4f9a932e-d23a-4638-b69b-16fdca20f7f4</uuid>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <cpu>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <arch>x86_64</arch>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model>EPYC-Rome-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <vendor>AMD</vendor>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <microcode version='16777317'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <signature family='23' model='49' stepping='0'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='x2apic'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='tsc-deadline'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='osxsave'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='hypervisor'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='tsc_adjust'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='spec-ctrl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='stibp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='arch-capabilities'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='cmp_legacy'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='topoext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='virt-ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='lbrv'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='tsc-scale'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='vmcb-clean'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='pause-filter'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='pfthreshold'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='svme-addr-chk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='rdctl-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='skip-l1dfl-vmentry'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='mds-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature name='pschange-mc-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <pages unit='KiB' size='4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <pages unit='KiB' size='2048'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <pages unit='KiB' size='1048576'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </cpu>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <power_management>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <suspend_mem/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <suspend_disk/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <suspend_hybrid/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </power_management>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <iommu support='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <migration_features>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <live/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <uri_transports>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <uri_transport>tcp</uri_transport>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <uri_transport>rdma</uri_transport>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </uri_transports>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </migration_features>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <topology>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <cells num='1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <cell id='0'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:           <memory unit='KiB'>7864300</memory>
Dec 10 10:15:43 compute-0 nova_compute[186989]:           <pages unit='KiB' size='4'>1966075</pages>
Dec 10 10:15:43 compute-0 nova_compute[186989]:           <pages unit='KiB' size='2048'>0</pages>
Dec 10 10:15:43 compute-0 nova_compute[186989]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 10 10:15:43 compute-0 nova_compute[186989]:           <distances>
Dec 10 10:15:43 compute-0 nova_compute[186989]:             <sibling id='0' value='10'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:           </distances>
Dec 10 10:15:43 compute-0 nova_compute[186989]:           <cpus num='8'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:           </cpus>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         </cell>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </cells>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </topology>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <cache>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </cache>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <secmodel>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model>selinux</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <doi>0</doi>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </secmodel>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <secmodel>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model>dac</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <doi>0</doi>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </secmodel>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </host>
Dec 10 10:15:43 compute-0 nova_compute[186989]: 
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <guest>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <os_type>hvm</os_type>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <arch name='i686'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <wordsize>32</wordsize>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <domain type='qemu'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <domain type='kvm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </arch>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <features>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <pae/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <nonpae/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <acpi default='on' toggle='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <apic default='on' toggle='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <cpuselection/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <deviceboot/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <disksnapshot default='on' toggle='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <externalSnapshot/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </features>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </guest>
Dec 10 10:15:43 compute-0 nova_compute[186989]: 
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <guest>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <os_type>hvm</os_type>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <arch name='x86_64'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <wordsize>64</wordsize>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <domain type='qemu'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <domain type='kvm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </arch>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <features>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <acpi default='on' toggle='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <apic default='on' toggle='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <cpuselection/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <deviceboot/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <disksnapshot default='on' toggle='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <externalSnapshot/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </features>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </guest>
Dec 10 10:15:43 compute-0 nova_compute[186989]: 
Dec 10 10:15:43 compute-0 nova_compute[186989]: </capabilities>
Dec 10 10:15:43 compute-0 nova_compute[186989]: 
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.782 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.786 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 10 10:15:43 compute-0 nova_compute[186989]: <domainCapabilities>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <path>/usr/libexec/qemu-kvm</path>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <domain>kvm</domain>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <arch>i686</arch>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <vcpu max='4096'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <iothreads supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <os supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <enum name='firmware'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <loader supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>rom</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pflash</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='readonly'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>yes</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>no</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='secure'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>no</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </loader>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </os>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <cpu>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='host-passthrough' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='hostPassthroughMigratable'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>on</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>off</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='maximum' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='maximumMigratable'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>on</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>off</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='host-model' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <vendor>AMD</vendor>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='x2apic'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='tsc-deadline'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='hypervisor'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='tsc_adjust'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='spec-ctrl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='stibp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='cmp_legacy'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='overflow-recov'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='succor'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='ibrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='amd-ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='virt-ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='lbrv'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='tsc-scale'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='vmcb-clean'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='flushbyasid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='pause-filter'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='pfthreshold'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='svme-addr-chk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='disable' name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='custom' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cooperlake'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cooperlake-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cooperlake-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Denverton'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Denverton-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Denverton-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Denverton-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Dhyana-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Genoa'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amd-psfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='auto-ibrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='stibp-always-on'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Genoa-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amd-psfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='auto-ibrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='stibp-always-on'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Milan'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Milan-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Milan-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amd-psfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='stibp-always-on'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='GraniteRapids'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='prefetchiti'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='GraniteRapids-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='prefetchiti'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='GraniteRapids-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx10'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx10-128'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx10-256'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx10-512'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='prefetchiti'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v6'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v7'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='IvyBridge'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='IvyBridge-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='IvyBridge-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='IvyBridge-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='KnightsMill'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512er'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512pf'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='KnightsMill-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512er'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512pf'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Opteron_G4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Opteron_G4-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Opteron_G5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tbm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Opteron_G5-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tbm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SierraForest'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cmpccxadd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SierraForest-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cmpccxadd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='athlon'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='athlon-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='core2duo'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='core2duo-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='coreduo'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='coreduo-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='n270'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='n270-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='phenom'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='phenom-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <memoryBacking supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <enum name='sourceType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>file</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>anonymous</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>memfd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </memoryBacking>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <disk supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='diskDevice'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>disk</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>cdrom</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>floppy</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>lun</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='bus'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>fdc</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>scsi</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>usb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>sata</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio-transitional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio-non-transitional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <graphics supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vnc</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>egl-headless</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>dbus</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </graphics>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <video supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='modelType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vga</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>cirrus</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>none</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>bochs</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>ramfb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </video>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <hostdev supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='mode'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>subsystem</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='startupPolicy'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>default</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>mandatory</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>requisite</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>optional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='subsysType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>usb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pci</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>scsi</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='capsType'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='pciBackend'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </hostdev>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <rng supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio-transitional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio-non-transitional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendModel'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>random</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>egd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>builtin</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <filesystem supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='driverType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>path</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>handle</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtiofs</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </filesystem>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <tpm supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tpm-tis</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tpm-crb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendModel'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>emulator</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>external</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendVersion'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>2.0</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </tpm>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <redirdev supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='bus'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>usb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </redirdev>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <channel supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pty</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>unix</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </channel>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <crypto supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>qemu</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendModel'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>builtin</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </crypto>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <interface supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>default</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>passt</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <panic supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>isa</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>hyperv</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </panic>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <console supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>null</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vc</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pty</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>dev</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>file</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pipe</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>stdio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>udp</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tcp</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>unix</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>qemu-vdagent</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>dbus</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </console>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <features>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <gic supported='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <vmcoreinfo supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <genid supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <backingStoreInput supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <backup supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <async-teardown supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <ps2 supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <sev supported='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <sgx supported='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <hyperv supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='features'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>relaxed</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vapic</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>spinlocks</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vpindex</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>runtime</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>synic</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>stimer</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>reset</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vendor_id</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>frequencies</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>reenlightenment</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tlbflush</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>ipi</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>avic</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>emsr_bitmap</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>xmm_input</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <defaults>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <spinlocks>4095</spinlocks>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <stimer_direct>on</stimer_direct>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <tlbflush_direct>on</tlbflush_direct>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <tlbflush_extended>on</tlbflush_extended>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </defaults>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </hyperv>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <launchSecurity supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='sectype'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tdx</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </launchSecurity>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </features>
Dec 10 10:15:43 compute-0 nova_compute[186989]: </domainCapabilities>
Dec 10 10:15:43 compute-0 nova_compute[186989]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.793 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 10 10:15:43 compute-0 nova_compute[186989]: <domainCapabilities>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <path>/usr/libexec/qemu-kvm</path>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <domain>kvm</domain>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <arch>i686</arch>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <vcpu max='240'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <iothreads supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <os supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <enum name='firmware'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <loader supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>rom</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pflash</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='readonly'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>yes</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>no</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='secure'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>no</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </loader>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </os>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <cpu>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='host-passthrough' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='hostPassthroughMigratable'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>on</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>off</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='maximum' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='maximumMigratable'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>on</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>off</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='host-model' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <vendor>AMD</vendor>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='x2apic'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='tsc-deadline'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='hypervisor'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='tsc_adjust'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='spec-ctrl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='stibp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='cmp_legacy'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='overflow-recov'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='succor'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='ibrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='amd-ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='virt-ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='lbrv'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='tsc-scale'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='vmcb-clean'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='flushbyasid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='pause-filter'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='pfthreshold'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='svme-addr-chk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='disable' name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='custom' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cooperlake'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cooperlake-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cooperlake-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Denverton'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Denverton-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Denverton-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Denverton-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Dhyana-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Genoa'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amd-psfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='auto-ibrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='stibp-always-on'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Genoa-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amd-psfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='auto-ibrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='stibp-always-on'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Milan'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Milan-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Milan-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amd-psfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='stibp-always-on'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='GraniteRapids'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='prefetchiti'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='GraniteRapids-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='prefetchiti'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='GraniteRapids-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx10'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx10-128'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx10-256'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx10-512'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='prefetchiti'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v6'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v7'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='IvyBridge'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='IvyBridge-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='IvyBridge-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='IvyBridge-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='KnightsMill'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512er'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512pf'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='KnightsMill-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512er'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512pf'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Opteron_G4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Opteron_G4-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Opteron_G5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tbm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Opteron_G5-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tbm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SierraForest'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cmpccxadd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SierraForest-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cmpccxadd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='athlon'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='athlon-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='core2duo'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='core2duo-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='coreduo'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='coreduo-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='n270'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='n270-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='phenom'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='phenom-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <memoryBacking supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <enum name='sourceType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>file</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>anonymous</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>memfd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </memoryBacking>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <disk supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='diskDevice'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>disk</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>cdrom</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>floppy</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>lun</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='bus'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>ide</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>fdc</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>scsi</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>usb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>sata</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio-transitional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio-non-transitional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <graphics supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vnc</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>egl-headless</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>dbus</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </graphics>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <video supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='modelType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vga</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>cirrus</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>none</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>bochs</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>ramfb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </video>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <hostdev supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='mode'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>subsystem</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='startupPolicy'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>default</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>mandatory</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>requisite</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>optional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='subsysType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>usb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pci</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>scsi</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='capsType'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='pciBackend'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </hostdev>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <rng supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio-transitional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio-non-transitional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendModel'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>random</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>egd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>builtin</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <filesystem supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='driverType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>path</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>handle</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtiofs</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </filesystem>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <tpm supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tpm-tis</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tpm-crb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendModel'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>emulator</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>external</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendVersion'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>2.0</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </tpm>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <redirdev supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='bus'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>usb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </redirdev>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <channel supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pty</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>unix</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </channel>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <crypto supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>qemu</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendModel'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>builtin</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </crypto>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <interface supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>default</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>passt</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <panic supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>isa</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>hyperv</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </panic>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <console supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>null</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vc</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pty</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>dev</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>file</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pipe</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>stdio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>udp</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tcp</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>unix</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>qemu-vdagent</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>dbus</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </console>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <features>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <gic supported='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <vmcoreinfo supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <genid supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <backingStoreInput supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <backup supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <async-teardown supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <ps2 supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <sev supported='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <sgx supported='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <hyperv supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='features'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>relaxed</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vapic</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>spinlocks</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vpindex</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>runtime</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>synic</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>stimer</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>reset</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vendor_id</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>frequencies</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>reenlightenment</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tlbflush</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>ipi</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>avic</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>emsr_bitmap</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>xmm_input</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <defaults>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <spinlocks>4095</spinlocks>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <stimer_direct>on</stimer_direct>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <tlbflush_direct>on</tlbflush_direct>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <tlbflush_extended>on</tlbflush_extended>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </defaults>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </hyperv>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <launchSecurity supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='sectype'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tdx</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </launchSecurity>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </features>
Dec 10 10:15:43 compute-0 nova_compute[186989]: </domainCapabilities>
Dec 10 10:15:43 compute-0 nova_compute[186989]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.842 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.845 186993 WARNING nova.virt.libvirt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.845 186993 DEBUG nova.virt.libvirt.volume.mount [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.850 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 10 10:15:43 compute-0 nova_compute[186989]: <domainCapabilities>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <path>/usr/libexec/qemu-kvm</path>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <domain>kvm</domain>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <arch>x86_64</arch>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <vcpu max='240'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <iothreads supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <os supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <enum name='firmware'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <loader supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>rom</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pflash</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='readonly'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>yes</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>no</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='secure'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>no</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </loader>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </os>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <cpu>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='host-passthrough' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='hostPassthroughMigratable'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>on</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>off</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='maximum' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='maximumMigratable'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>on</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>off</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='host-model' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <vendor>AMD</vendor>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='x2apic'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='tsc-deadline'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='hypervisor'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='tsc_adjust'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='spec-ctrl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='stibp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='cmp_legacy'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='overflow-recov'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='succor'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='ibrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='amd-ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='virt-ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='lbrv'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='tsc-scale'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='vmcb-clean'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='flushbyasid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='pause-filter'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='pfthreshold'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='svme-addr-chk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='disable' name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='custom' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cooperlake'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cooperlake-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cooperlake-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Denverton'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Denverton-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Denverton-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Denverton-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Dhyana-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Genoa'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amd-psfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='auto-ibrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='stibp-always-on'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Genoa-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amd-psfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='auto-ibrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='stibp-always-on'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Milan'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Milan-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Milan-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amd-psfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='stibp-always-on'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='EPYC-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='GraniteRapids'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='prefetchiti'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='GraniteRapids-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='prefetchiti'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='GraniteRapids-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx10'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx10-128'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx10-256'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx10-512'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='prefetchiti'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Haswell-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v6'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v7'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='IvyBridge'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='IvyBridge-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='IvyBridge-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='IvyBridge-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='KnightsMill'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512er'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512pf'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='KnightsMill-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512er'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512pf'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Opteron_G4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Opteron_G4-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Opteron_G5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tbm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Opteron_G5-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tbm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SierraForest'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cmpccxadd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='SierraForest-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-ifma'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cmpccxadd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='athlon'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='athlon-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='core2duo'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='core2duo-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='coreduo'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='coreduo-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='n270'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='n270-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='phenom'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='phenom-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <memoryBacking supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <enum name='sourceType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>file</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>anonymous</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>memfd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </memoryBacking>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <disk supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='diskDevice'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>disk</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>cdrom</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>floppy</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>lun</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='bus'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>ide</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>fdc</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>scsi</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>usb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>sata</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio-transitional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio-non-transitional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <graphics supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vnc</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>egl-headless</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>dbus</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </graphics>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <video supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='modelType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vga</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>cirrus</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>none</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>bochs</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>ramfb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </video>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <hostdev supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='mode'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>subsystem</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='startupPolicy'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>default</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>mandatory</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>requisite</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>optional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='subsysType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>usb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pci</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>scsi</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='capsType'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='pciBackend'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </hostdev>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <rng supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio-transitional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtio-non-transitional</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendModel'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>random</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>egd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>builtin</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <filesystem supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='driverType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>path</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>handle</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>virtiofs</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </filesystem>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <tpm supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tpm-tis</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tpm-crb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendModel'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>emulator</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>external</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendVersion'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>2.0</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </tpm>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <redirdev supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='bus'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>usb</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </redirdev>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <channel supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pty</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>unix</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </channel>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <crypto supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>qemu</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendModel'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>builtin</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </crypto>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <interface supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='backendType'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>default</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>passt</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <panic supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>isa</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>hyperv</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </panic>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <console supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>null</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vc</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pty</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>dev</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>file</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pipe</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>stdio</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>udp</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tcp</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>unix</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>qemu-vdagent</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>dbus</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </console>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <features>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <gic supported='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <vmcoreinfo supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <genid supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <backingStoreInput supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <backup supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <async-teardown supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <ps2 supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <sev supported='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <sgx supported='no'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <hyperv supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='features'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>relaxed</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vapic</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>spinlocks</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vpindex</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>runtime</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>synic</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>stimer</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>reset</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>vendor_id</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>frequencies</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>reenlightenment</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tlbflush</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>ipi</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>avic</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>emsr_bitmap</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>xmm_input</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <defaults>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <spinlocks>4095</spinlocks>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <stimer_direct>on</stimer_direct>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <tlbflush_direct>on</tlbflush_direct>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <tlbflush_extended>on</tlbflush_extended>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </defaults>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </hyperv>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <launchSecurity supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='sectype'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>tdx</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </launchSecurity>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </features>
Dec 10 10:15:43 compute-0 nova_compute[186989]: </domainCapabilities>
Dec 10 10:15:43 compute-0 nova_compute[186989]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 10 10:15:43 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.914 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 10 10:15:43 compute-0 nova_compute[186989]: <domainCapabilities>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <path>/usr/libexec/qemu-kvm</path>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <domain>kvm</domain>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <arch>x86_64</arch>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <vcpu max='4096'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <iothreads supported='yes'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <os supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <enum name='firmware'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>efi</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <loader supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>rom</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>pflash</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='readonly'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>yes</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>no</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='secure'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>yes</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>no</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </loader>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   </os>
Dec 10 10:15:43 compute-0 nova_compute[186989]:   <cpu>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='host-passthrough' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='hostPassthroughMigratable'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>on</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>off</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='maximum' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <enum name='maximumMigratable'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>on</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <value>off</value>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='host-model' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <vendor>AMD</vendor>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='x2apic'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='tsc-deadline'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='hypervisor'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='tsc_adjust'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='spec-ctrl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='stibp'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='cmp_legacy'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='overflow-recov'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='succor'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='ibrs'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='amd-ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='virt-ssbd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='lbrv'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='tsc-scale'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='vmcb-clean'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='flushbyasid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='pause-filter'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='pfthreshold'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='svme-addr-chk'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <feature policy='disable' name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:43 compute-0 nova_compute[186989]:     <mode name='custom' supported='yes'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Broadwell-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v1'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v2'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v3'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v4'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cascadelake-Server-v5'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 10 10:15:43 compute-0 nova_compute[186989]:       <blockers model='Cooperlake'>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:43 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Cooperlake-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Cooperlake-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Denverton'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Denverton-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Denverton-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Denverton-v3'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Dhyana-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='EPYC-Genoa'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amd-psfd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='auto-ibrs'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='stibp-always-on'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='EPYC-Genoa-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amd-psfd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='auto-ibrs'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='stibp-always-on'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='EPYC-Milan'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='EPYC-Milan-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='EPYC-Milan-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amd-psfd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='no-nested-data-bp'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='null-sel-clr-base'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='stibp-always-on'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='EPYC-Rome-v3'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='EPYC-v3'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='EPYC-v4'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='GraniteRapids'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-fp16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='prefetchiti'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='GraniteRapids-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-fp16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='prefetchiti'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='GraniteRapids-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-fp16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx10'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx10-128'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx10-256'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx10-512'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='prefetchiti'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Haswell'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Haswell-IBRS'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Haswell-noTSX'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Haswell-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Haswell-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Haswell-v3'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Haswell-v4'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-noTSX'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v3'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v4'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v5'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v6'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Icelake-Server-v7'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='IvyBridge'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='IvyBridge-IBRS'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='IvyBridge-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='IvyBridge-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='KnightsMill'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512er'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512pf'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='KnightsMill-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-4fmaps'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-4vnniw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512er'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512pf'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Opteron_G4'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Opteron_G4-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Opteron_G5'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='tbm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Opteron_G5-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fma4'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='tbm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xop'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='SapphireRapids-v3'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-int8'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='amx-tile'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-bf16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-fp16'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512-vpopcntdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bitalg'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vbmi2'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrc'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fzrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='la57'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='taa-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='tsx-ldtrk'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xfd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='SierraForest'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='cmpccxadd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='SierraForest-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-ifma'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-ne-convert'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-vnni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx-vnni-int8'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='bus-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='cmpccxadd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fbsdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='fsrs'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ibrs-all'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='mcdt-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pbrsb-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='psdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='sbdr-ssdp-no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='serialize'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vaes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='vpclmulqdq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-IBRS'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v3'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Client-v4'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-IBRS'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='hle'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='rtm'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v3'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v4'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Skylake-Server-v5'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512bw'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512cd'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512dq'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512f'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='avx512vl'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='invpcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pcid'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='pku'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Snowridge'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='mpx'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v2'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v3'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='core-capability'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='split-lock-detect'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='Snowridge-v4'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='cldemote'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='erms'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='gfni'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdir64b'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='movdiri'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='xsaves'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='athlon'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='athlon-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='core2duo'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='core2duo-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='coreduo'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='coreduo-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='n270'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='n270-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='ss'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='phenom'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <blockers model='phenom-v1'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='3dnow'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <feature name='3dnowext'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </blockers>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </mode>
Dec 10 10:15:44 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:15:44 compute-0 nova_compute[186989]:   <memoryBacking supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <enum name='sourceType'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <value>file</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <value>anonymous</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <value>memfd</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:   </memoryBacking>
Dec 10 10:15:44 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <disk supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='diskDevice'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>disk</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>cdrom</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>floppy</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>lun</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='bus'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>fdc</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>scsi</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>usb</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>sata</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>virtio-transitional</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>virtio-non-transitional</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <graphics supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>vnc</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>egl-headless</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>dbus</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </graphics>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <video supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='modelType'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>vga</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>cirrus</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>none</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>bochs</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>ramfb</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </video>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <hostdev supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='mode'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>subsystem</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='startupPolicy'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>default</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>mandatory</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>requisite</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>optional</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='subsysType'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>usb</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>pci</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>scsi</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='capsType'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='pciBackend'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </hostdev>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <rng supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>virtio</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>virtio-transitional</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>virtio-non-transitional</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='backendModel'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>random</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>egd</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>builtin</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <filesystem supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='driverType'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>path</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>handle</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>virtiofs</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </filesystem>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <tpm supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>tpm-tis</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>tpm-crb</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='backendModel'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>emulator</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>external</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='backendVersion'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>2.0</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </tpm>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <redirdev supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='bus'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>usb</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </redirdev>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <channel supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>pty</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>unix</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </channel>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <crypto supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='model'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>qemu</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='backendModel'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>builtin</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </crypto>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <interface supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='backendType'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>default</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>passt</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <panic supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='model'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>isa</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>hyperv</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </panic>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <console supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='type'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>null</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>vc</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>pty</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>dev</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>file</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>pipe</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>stdio</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>udp</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>tcp</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>unix</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>qemu-vdagent</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>dbus</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </console>
Dec 10 10:15:44 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:15:44 compute-0 nova_compute[186989]:   <features>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <gic supported='no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <vmcoreinfo supported='yes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <genid supported='yes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <backingStoreInput supported='yes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <backup supported='yes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <async-teardown supported='yes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <ps2 supported='yes'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <sev supported='no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <sgx supported='no'/>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <hyperv supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='features'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>relaxed</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>vapic</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>spinlocks</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>vpindex</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>runtime</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>synic</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>stimer</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>reset</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>vendor_id</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>frequencies</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>reenlightenment</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>tlbflush</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>ipi</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>avic</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>emsr_bitmap</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>xmm_input</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <defaults>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <spinlocks>4095</spinlocks>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <stimer_direct>on</stimer_direct>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <tlbflush_direct>on</tlbflush_direct>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <tlbflush_extended>on</tlbflush_extended>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </defaults>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </hyperv>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     <launchSecurity supported='yes'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       <enum name='sectype'>
Dec 10 10:15:44 compute-0 nova_compute[186989]:         <value>tdx</value>
Dec 10 10:15:44 compute-0 nova_compute[186989]:       </enum>
Dec 10 10:15:44 compute-0 nova_compute[186989]:     </launchSecurity>
Dec 10 10:15:44 compute-0 nova_compute[186989]:   </features>
Dec 10 10:15:44 compute-0 nova_compute[186989]: </domainCapabilities>
Dec 10 10:15:44 compute-0 nova_compute[186989]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.980 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.981 186993 INFO nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Secure Boot support detected
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.983 186993 INFO nova.virt.libvirt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:43.995 186993 DEBUG nova.virt.libvirt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.041 186993 INFO nova.virt.node [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Determined node identity 94de3f96-a911-486c-b08b-8a5da489baa6 from /var/lib/nova/compute_id
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.178 186993 WARNING nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Compute nodes ['94de3f96-a911-486c-b08b-8a5da489baa6'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.297 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.362 186993 WARNING nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.362 186993 DEBUG oslo_concurrency.lockutils [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.362 186993 DEBUG oslo_concurrency.lockutils [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.363 186993 DEBUG oslo_concurrency.lockutils [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.363 186993 DEBUG nova.compute.resource_tracker [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:15:44 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 10 10:15:44 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.642 186993 WARNING nova.virt.libvirt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.643 186993 DEBUG nova.compute.resource_tracker [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6172MB free_disk=73.53335952758789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.643 186993 DEBUG oslo_concurrency.lockutils [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.643 186993 DEBUG oslo_concurrency.lockutils [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.656 186993 WARNING nova.compute.resource_tracker [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] No compute node record for compute-0.ctlplane.example.com:94de3f96-a911-486c-b08b-8a5da489baa6: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 94de3f96-a911-486c-b08b-8a5da489baa6 could not be found.
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.675 186993 INFO nova.compute.resource_tracker [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 94de3f96-a911-486c-b08b-8a5da489baa6
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.749 186993 DEBUG nova.compute.resource_tracker [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:15:44 compute-0 nova_compute[186989]: 2025-12-10 10:15:44.749 186993 DEBUG nova.compute.resource_tracker [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:15:45 compute-0 nova_compute[186989]: 2025-12-10 10:15:45.791 186993 INFO nova.scheduler.client.report [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [req-13394fe4-95e3-4eaf-9296-71e8341a1b4a] Created resource provider record via placement API for resource provider with UUID 94de3f96-a911-486c-b08b-8a5da489baa6 and name compute-0.ctlplane.example.com.
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.161 186993 DEBUG nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 10 10:15:46 compute-0 nova_compute[186989]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.161 186993 INFO nova.virt.libvirt.host [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] kernel doesn't support AMD SEV
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.162 186993 DEBUG nova.compute.provider_tree [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Updating inventory in ProviderTree for provider 94de3f96-a911-486c-b08b-8a5da489baa6 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.162 186993 DEBUG nova.virt.libvirt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.215 186993 DEBUG nova.scheduler.client.report [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Updated inventory for provider 94de3f96-a911-486c-b08b-8a5da489baa6 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.215 186993 DEBUG nova.compute.provider_tree [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Updating resource provider 94de3f96-a911-486c-b08b-8a5da489baa6 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.215 186993 DEBUG nova.compute.provider_tree [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Updating inventory in ProviderTree for provider 94de3f96-a911-486c-b08b-8a5da489baa6 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.326 186993 DEBUG nova.compute.provider_tree [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Updating resource provider 94de3f96-a911-486c-b08b-8a5da489baa6 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.364 186993 DEBUG nova.compute.resource_tracker [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.365 186993 DEBUG oslo_concurrency.lockutils [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.722s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.365 186993 DEBUG nova.service [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.470 186993 DEBUG nova.service [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 10 10:15:46 compute-0 nova_compute[186989]: 2025-12-10 10:15:46.471 186993 DEBUG nova.servicegroup.drivers.db [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 10 10:15:47 compute-0 sshd-session[187358]: Accepted publickey for zuul from 192.168.122.30 port 40772 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:15:47 compute-0 systemd-logind[787]: New session 26 of user zuul.
Dec 10 10:15:47 compute-0 systemd[1]: Started Session 26 of User zuul.
Dec 10 10:15:47 compute-0 sshd-session[187358]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:15:48 compute-0 python3.9[187511]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 10 10:15:50 compute-0 sudo[187665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozwmeascsltcfrwwntpgshurwjczvbyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361749.4405124-36-134540424155162/AnsiballZ_systemd_service.py'
Dec 10 10:15:50 compute-0 sudo[187665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:50 compute-0 python3.9[187667]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:15:50 compute-0 systemd[1]: Reloading.
Dec 10 10:15:50 compute-0 nova_compute[186989]: 2025-12-10 10:15:50.473 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:15:50 compute-0 systemd-sysv-generator[187699]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:15:50 compute-0 nova_compute[186989]: 2025-12-10 10:15:50.508 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:15:50 compute-0 systemd-rc-local-generator[187693]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:15:50 compute-0 sudo[187665]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:51 compute-0 python3.9[187852]: ansible-ansible.builtin.service_facts Invoked
Dec 10 10:15:51 compute-0 network[187869]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 10 10:15:51 compute-0 network[187870]: 'network-scripts' will be removed from distribution in near future.
Dec 10 10:15:51 compute-0 network[187871]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 10 10:15:55 compute-0 sudo[188143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovmczsmeyijcknclouutztpavbkqmcsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361754.8815804-55-104509369802455/AnsiballZ_systemd_service.py'
Dec 10 10:15:55 compute-0 sudo[188143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:55 compute-0 python3.9[188145]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:15:55 compute-0 sudo[188143]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:56 compute-0 sudo[188296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkxweymlwwzmggsfeyuohxggmagvcxeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361755.7631185-65-158332413432652/AnsiballZ_file.py'
Dec 10 10:15:56 compute-0 sudo[188296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:56 compute-0 python3.9[188298]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:15:56 compute-0 sudo[188296]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:56 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 10 10:15:56 compute-0 sudo[188449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enfdgaxdcszxrczthofprpetabqttgyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361756.6009014-73-271051231016843/AnsiballZ_file.py'
Dec 10 10:15:56 compute-0 sudo[188449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:57 compute-0 python3.9[188451]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:15:57 compute-0 sudo[188449]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:57 compute-0 sudo[188601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdjyvolnqcwlorabyzicbtypylvgdjus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361757.3268604-82-248417576508095/AnsiballZ_command.py'
Dec 10 10:15:57 compute-0 sudo[188601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:57 compute-0 python3.9[188603]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:15:57 compute-0 sudo[188601]: pam_unix(sudo:session): session closed for user root
Dec 10 10:15:58 compute-0 python3.9[188755]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 10 10:15:59 compute-0 sudo[188905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpbruzqpgsdblkpupzuzzdsguosclqtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361758.949484-100-20228543484160/AnsiballZ_systemd_service.py'
Dec 10 10:15:59 compute-0 sudo[188905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:15:59 compute-0 python3.9[188907]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:15:59 compute-0 systemd[1]: Reloading.
Dec 10 10:15:59 compute-0 systemd-rc-local-generator[188934]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:15:59 compute-0 systemd-sysv-generator[188938]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:15:59 compute-0 sudo[188905]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:00 compute-0 sudo[189092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvcbcrxpupnrgsqqdzntjhzztqsignxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361759.9788766-108-199719808942129/AnsiballZ_command.py'
Dec 10 10:16:00 compute-0 sudo[189092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:00 compute-0 python3.9[189094]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:16:00 compute-0 sudo[189092]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:00 compute-0 sudo[189245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upljcijfazidqbfubxtrkqrnklfvguec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361760.6980908-117-186475782800141/AnsiballZ_file.py'
Dec 10 10:16:00 compute-0 sudo[189245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:01 compute-0 python3.9[189247]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:16:01 compute-0 sudo[189245]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:01 compute-0 python3.9[189397]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:16:02 compute-0 python3.9[189549]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:03 compute-0 python3.9[189670]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361762.1735814-133-245707542466765/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:16:04 compute-0 sudo[189820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwdcvyvzzkqojvksazmechzmcrqmbvsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361763.5700758-148-230424949368846/AnsiballZ_group.py'
Dec 10 10:16:04 compute-0 sudo[189820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:04 compute-0 python3.9[189822]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 10 10:16:04 compute-0 sudo[189820]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:05 compute-0 sudo[189972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyhwwvvyhfphmehekkrwuktpdcnjoegx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361764.5457785-159-170713332278534/AnsiballZ_getent.py'
Dec 10 10:16:05 compute-0 sudo[189972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:05 compute-0 python3.9[189974]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 10 10:16:05 compute-0 sudo[189972]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:05 compute-0 sudo[190125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxykjlkwrzqfnxwgbhqqycqzesoinzde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361765.4106028-167-262454790236885/AnsiballZ_group.py'
Dec 10 10:16:05 compute-0 sudo[190125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:05 compute-0 python3.9[190127]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 10 10:16:05 compute-0 groupadd[190128]: group added to /etc/group: name=ceilometer, GID=42405
Dec 10 10:16:05 compute-0 groupadd[190128]: group added to /etc/gshadow: name=ceilometer
Dec 10 10:16:05 compute-0 groupadd[190128]: new group: name=ceilometer, GID=42405
Dec 10 10:16:06 compute-0 sudo[190125]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:06 compute-0 sudo[190283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrmlqtrpugwfpwdeowzkmszxsplotvyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361766.1902697-175-255165759961308/AnsiballZ_user.py'
Dec 10 10:16:06 compute-0 sudo[190283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:06 compute-0 python3.9[190285]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 10 10:16:06 compute-0 useradd[190288]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Dec 10 10:16:06 compute-0 useradd[190288]: add 'ceilometer' to group 'libvirt'
Dec 10 10:16:06 compute-0 useradd[190288]: add 'ceilometer' to shadow group 'libvirt'
Dec 10 10:16:07 compute-0 podman[190286]: 2025-12-10 10:16:07.031703593 +0000 UTC m=+0.069092092 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:16:07 compute-0 sudo[190283]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:08 compute-0 python3.9[190462]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:08 compute-0 python3.9[190583]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765361767.6832118-201-230278819278019/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:09 compute-0 python3.9[190733]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:09 compute-0 python3.9[190854]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765361768.814055-201-91991380561604/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:10 compute-0 python3.9[191004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:10 compute-0 python3.9[191125]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765361769.9255388-201-35953373631445/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:11 compute-0 python3.9[191275]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:16:12 compute-0 python3.9[191427]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:16:12 compute-0 podman[191553]: 2025-12-10 10:16:12.735389353 +0000 UTC m=+0.099029706 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2)
Dec 10 10:16:12 compute-0 python3.9[191598]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:13 compute-0 podman[191699]: 2025-12-10 10:16:13.329639131 +0000 UTC m=+0.101475503 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251202)
Dec 10 10:16:13 compute-0 python3.9[191737]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361772.335483-260-158799822649637/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:14 compute-0 python3.9[191893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:14 compute-0 python3.9[191969]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:15 compute-0 python3.9[192119]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:15 compute-0 python3.9[192240]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361774.5866518-260-199433491653470/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:16 compute-0 python3.9[192390]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:16 compute-0 python3.9[192511]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361775.7761564-260-218049995483459/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:17 compute-0 python3.9[192661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:17 compute-0 python3.9[192782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361776.9064631-260-14273067583532/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:18 compute-0 python3.9[192932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:19 compute-0 python3.9[193053]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361778.0228577-260-92842312269482/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:19 compute-0 python3.9[193203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:20 compute-0 python3.9[193324]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361779.2464912-260-10751008991276/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:20 compute-0 python3.9[193474]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:21 compute-0 python3.9[193595]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361780.4627237-260-30871580487213/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:22 compute-0 python3.9[193745]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:22 compute-0 python3.9[193866]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361781.7624657-260-115919708269203/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:23 compute-0 python3.9[194016]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:23 compute-0 python3.9[194137]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361782.8901255-260-272507397506679/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:24 compute-0 python3.9[194287]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:25 compute-0 python3.9[194408]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361784.0564723-260-177299781836323/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:25 compute-0 python3.9[194558]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:26 compute-0 python3.9[194634]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:27 compute-0 python3.9[194784]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:27 compute-0 python3.9[194860]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:28 compute-0 python3.9[195010]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:28 compute-0 python3.9[195086]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:29 compute-0 sudo[195236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubtgljndvxhdsahrlcgjvpojsdiomzot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361788.7869005-449-127241094682240/AnsiballZ_file.py'
Dec 10 10:16:29 compute-0 sudo[195236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:29 compute-0 python3.9[195238]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:29 compute-0 sudo[195236]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:29 compute-0 sudo[195388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrfiigjzbadfzmbweasihzcdlwirnjns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361789.513036-457-255811164332201/AnsiballZ_file.py'
Dec 10 10:16:29 compute-0 sudo[195388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:29 compute-0 python3.9[195390]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:30 compute-0 sudo[195388]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:30 compute-0 sudo[195540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggqjvqmsyayhyodamyccdilgxjnmqrkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361790.1795628-465-71692298820981/AnsiballZ_file.py'
Dec 10 10:16:30 compute-0 sudo[195540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:30 compute-0 python3.9[195542]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:16:30 compute-0 sudo[195540]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:31 compute-0 sudo[195692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfupbsswxuuswrrsjympkzuykrqbaowt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361790.9246764-473-211659781518320/AnsiballZ_systemd_service.py'
Dec 10 10:16:31 compute-0 sudo[195692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:16:31.456 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:16:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:16:31.456 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:16:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:16:31.456 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:16:31 compute-0 python3.9[195694]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:16:31 compute-0 systemd[1]: Reloading.
Dec 10 10:16:31 compute-0 systemd-rc-local-generator[195719]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:16:31 compute-0 systemd-sysv-generator[195722]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:16:31 compute-0 systemd[1]: Listening on Podman API Socket.
Dec 10 10:16:31 compute-0 sudo[195692]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:32 compute-0 sudo[195883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqsowvwlfxiziktnhgdawbnatwqejxpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361792.2294557-482-266219813037599/AnsiballZ_stat.py'
Dec 10 10:16:32 compute-0 sudo[195883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:32 compute-0 python3.9[195885]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:32 compute-0 sudo[195883]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:33 compute-0 sudo[196006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emunpiklbgctyoduknewsrlcowpfdpba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361792.2294557-482-266219813037599/AnsiballZ_copy.py'
Dec 10 10:16:33 compute-0 sudo[196006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:33 compute-0 python3.9[196008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361792.2294557-482-266219813037599/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:16:33 compute-0 sudo[196006]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:33 compute-0 sudo[196082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vairhztyqxsqqlnlfqwrqskgmxeozpcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361792.2294557-482-266219813037599/AnsiballZ_stat.py'
Dec 10 10:16:33 compute-0 sudo[196082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:34 compute-0 python3.9[196084]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:34 compute-0 sudo[196082]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:34 compute-0 sudo[196205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skhftfpnhajdzqwqopomfcksghbzgrvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361792.2294557-482-266219813037599/AnsiballZ_copy.py'
Dec 10 10:16:34 compute-0 sudo[196205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:34 compute-0 python3.9[196207]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361792.2294557-482-266219813037599/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:16:34 compute-0 sudo[196205]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:35 compute-0 sudo[196357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eezmtyarrqlgybbkhqvcqhzxggqyuzzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361794.9565418-510-31326398236389/AnsiballZ_container_config_data.py'
Dec 10 10:16:35 compute-0 sudo[196357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:35 compute-0 python3.9[196359]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Dec 10 10:16:35 compute-0 sudo[196357]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:36 compute-0 sudo[196509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxhpvjojanaybsdeontilvraunodiexs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361795.9085126-519-72888595970424/AnsiballZ_container_config_hash.py'
Dec 10 10:16:36 compute-0 sudo[196509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:36 compute-0 python3.9[196511]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 10 10:16:36 compute-0 sudo[196509]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:37 compute-0 sudo[196667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enzsjpggmezmsjuinwbygtgovszfrvla ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361796.9136608-529-40380675876988/AnsiballZ_edpm_container_manage.py'
Dec 10 10:16:37 compute-0 sudo[196667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:37 compute-0 podman[196635]: 2025-12-10 10:16:37.521697381 +0000 UTC m=+0.078700499 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:16:37 compute-0 python3[196673]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 10 10:16:38 compute-0 podman[196720]: 2025-12-10 10:16:38.065295354 +0000 UTC m=+0.051701483 container create 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_managed=true)
Dec 10 10:16:38 compute-0 podman[196720]: 2025-12-10 10:16:38.037950639 +0000 UTC m=+0.024356738 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 10 10:16:38 compute-0 python3[196673]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Dec 10 10:16:38 compute-0 sudo[196667]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:38 compute-0 sudo[196908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsvfqyigvlzgmfezgjdbswtyzfrjmfci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361798.4544432-537-250859959734510/AnsiballZ_stat.py'
Dec 10 10:16:38 compute-0 sudo[196908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:38 compute-0 python3.9[196910]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:16:38 compute-0 sudo[196908]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:39 compute-0 sudo[197062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmdgbwbtwafuhqjoqlxumpszqyxdlbkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361799.2589872-546-162125063331244/AnsiballZ_file.py'
Dec 10 10:16:39 compute-0 sudo[197062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:39 compute-0 python3.9[197064]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:39 compute-0 sudo[197062]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:40 compute-0 sudo[197213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsehscujwgrpxkmtwaaljduavqithqvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361799.9100142-546-242437389382360/AnsiballZ_copy.py'
Dec 10 10:16:40 compute-0 sudo[197213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:40 compute-0 python3.9[197215]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765361799.9100142-546-242437389382360/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:40 compute-0 sudo[197213]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:41 compute-0 sudo[197289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-botgqwfgqdzzeqquawhykxntdogtstbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361799.9100142-546-242437389382360/AnsiballZ_systemd.py'
Dec 10 10:16:41 compute-0 sudo[197289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:41 compute-0 python3.9[197291]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:16:41 compute-0 systemd[1]: Reloading.
Dec 10 10:16:41 compute-0 systemd-sysv-generator[197321]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:16:41 compute-0 systemd-rc-local-generator[197313]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:16:41 compute-0 sudo[197289]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:42 compute-0 sudo[197400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlfgjvlzqmzapmivwycanmsvmgscdmit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361799.9100142-546-242437389382360/AnsiballZ_systemd.py'
Dec 10 10:16:42 compute-0 sudo[197400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:42 compute-0 python3.9[197402]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:16:42 compute-0 systemd[1]: Reloading.
Dec 10 10:16:42 compute-0 systemd-rc-local-generator[197433]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:16:42 compute-0 systemd-sysv-generator[197436]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:16:42 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Dec 10 10:16:42 compute-0 podman[197441]: 2025-12-10 10:16:42.872858741 +0000 UTC m=+0.075322949 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Dec 10 10:16:42 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:16:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c855f8285fd33a563c65aae161641b39e0111cbf4aaebe011143dfabb91a47e/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec 10 10:16:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c855f8285fd33a563c65aae161641b39e0111cbf4aaebe011143dfabb91a47e/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 10 10:16:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c855f8285fd33a563c65aae161641b39e0111cbf4aaebe011143dfabb91a47e/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 10 10:16:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c855f8285fd33a563c65aae161641b39e0111cbf4aaebe011143dfabb91a47e/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 10 10:16:42 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb.
Dec 10 10:16:42 compute-0 podman[197444]: 2025-12-10 10:16:42.917284787 +0000 UTC m=+0.112996973 container init 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible)
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.924 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.925 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.925 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.925 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:16:42 compute-0 ceilometer_agent_compute[197479]: + sudo -E kolla_set_configs
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.942 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.942 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.942 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.943 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.943 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.943 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.944 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.944 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.944 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:16:42 compute-0 podman[197444]: 2025-12-10 10:16:42.948351413 +0000 UTC m=+0.144063549 container start 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute)
Dec 10 10:16:42 compute-0 ceilometer_agent_compute[197479]: sudo: unable to send audit message: Operation not permitted
Dec 10 10:16:42 compute-0 sudo[197490]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 10 10:16:42 compute-0 sudo[197490]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 10 10:16:42 compute-0 sudo[197490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 10 10:16:42 compute-0 podman[197444]: ceilometer_agent_compute
Dec 10 10:16:42 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.969 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.970 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.970 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:16:42 compute-0 nova_compute[186989]: 2025-12-10 10:16:42.970 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:16:42 compute-0 sudo[197400]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Validating config file
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Copying service configuration files
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 10 10:16:43 compute-0 podman[197491]: 2025-12-10 10:16:43.021605325 +0000 UTC m=+0.063157971 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: INFO:__main__:Writing out command to execute
Dec 10 10:16:43 compute-0 systemd[1]: 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb-16e3059ea6297774.service: Main process exited, code=exited, status=1/FAILURE
Dec 10 10:16:43 compute-0 systemd[1]: 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb-16e3059ea6297774.service: Failed with result 'exit-code'.
Dec 10 10:16:43 compute-0 sudo[197490]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: ++ cat /run_command
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: + ARGS=
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: + sudo kolla_copy_cacerts
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: sudo: unable to send audit message: Operation not permitted
Dec 10 10:16:43 compute-0 sudo[197526]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 10 10:16:43 compute-0 sudo[197526]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 10 10:16:43 compute-0 sudo[197526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 10 10:16:43 compute-0 sudo[197526]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: + [[ ! -n '' ]]
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: + . kolla_extend_start
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: + umask 0022
Dec 10 10:16:43 compute-0 ceilometer_agent_compute[197479]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 10 10:16:43 compute-0 nova_compute[186989]: 2025-12-10 10:16:43.120 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:16:43 compute-0 nova_compute[186989]: 2025-12-10 10:16:43.121 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6105MB free_disk=73.53241348266602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:16:43 compute-0 nova_compute[186989]: 2025-12-10 10:16:43.121 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:16:43 compute-0 nova_compute[186989]: 2025-12-10 10:16:43.122 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:16:43 compute-0 nova_compute[186989]: 2025-12-10 10:16:43.180 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:16:43 compute-0 nova_compute[186989]: 2025-12-10 10:16:43.180 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:16:43 compute-0 nova_compute[186989]: 2025-12-10 10:16:43.203 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:16:43 compute-0 nova_compute[186989]: 2025-12-10 10:16:43.217 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:16:43 compute-0 nova_compute[186989]: 2025-12-10 10:16:43.219 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:16:43 compute-0 nova_compute[186989]: 2025-12-10 10:16:43.219 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:16:43 compute-0 podman[197639]: 2025-12-10 10:16:43.480051677 +0000 UTC m=+0.067539690 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:16:43 compute-0 sudo[197681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgleuzguehtlitiodgkyvhmuqnatnppc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361803.1587543-570-202312713487215/AnsiballZ_systemd.py'
Dec 10 10:16:43 compute-0 sudo[197681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:43 compute-0 python3.9[197686]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:16:43 compute-0 systemd[1]: Stopping ceilometer_agent_compute container...
Dec 10 10:16:43 compute-0 systemd[1]: libpod-1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb.scope: Deactivated successfully.
Dec 10 10:16:43 compute-0 podman[197690]: 2025-12-10 10:16:43.891230145 +0000 UTC m=+0.051081386 container died 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Dec 10 10:16:43 compute-0 systemd[1]: 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb-16e3059ea6297774.timer: Deactivated successfully.
Dec 10 10:16:43 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb.
Dec 10 10:16:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb-userdata-shm.mount: Deactivated successfully.
Dec 10 10:16:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-1c855f8285fd33a563c65aae161641b39e0111cbf4aaebe011143dfabb91a47e-merged.mount: Deactivated successfully.
Dec 10 10:16:43 compute-0 podman[197690]: 2025-12-10 10:16:43.940870911 +0000 UTC m=+0.100722152 container cleanup 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:16:43 compute-0 podman[197690]: ceilometer_agent_compute
Dec 10 10:16:44 compute-0 podman[197720]: ceilometer_agent_compute
Dec 10 10:16:44 compute-0 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Dec 10 10:16:44 compute-0 systemd[1]: Stopped ceilometer_agent_compute container.
Dec 10 10:16:44 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Dec 10 10:16:44 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:16:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c855f8285fd33a563c65aae161641b39e0111cbf4aaebe011143dfabb91a47e/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec 10 10:16:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c855f8285fd33a563c65aae161641b39e0111cbf4aaebe011143dfabb91a47e/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 10 10:16:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c855f8285fd33a563c65aae161641b39e0111cbf4aaebe011143dfabb91a47e/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 10 10:16:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c855f8285fd33a563c65aae161641b39e0111cbf4aaebe011143dfabb91a47e/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 10 10:16:44 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb.
Dec 10 10:16:44 compute-0 podman[197733]: 2025-12-10 10:16:44.167328737 +0000 UTC m=+0.121455260 container init 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=edpm)
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: + sudo -E kolla_set_configs
Dec 10 10:16:44 compute-0 podman[197733]: 2025-12-10 10:16:44.194236362 +0000 UTC m=+0.148362865 container start 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:16:44 compute-0 podman[197733]: ceilometer_agent_compute
Dec 10 10:16:44 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: sudo: unable to send audit message: Operation not permitted
Dec 10 10:16:44 compute-0 sudo[197754]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 10 10:16:44 compute-0 sudo[197754]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 10 10:16:44 compute-0 sudo[197754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 10 10:16:44 compute-0 sudo[197681]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Validating config file
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Copying service configuration files
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: INFO:__main__:Writing out command to execute
Dec 10 10:16:44 compute-0 sudo[197754]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: ++ cat /run_command
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: + ARGS=
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: + sudo kolla_copy_cacerts
Dec 10 10:16:44 compute-0 podman[197755]: 2025-12-10 10:16:44.278987403 +0000 UTC m=+0.062575516 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute)
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: sudo: unable to send audit message: Operation not permitted
Dec 10 10:16:44 compute-0 sudo[197781]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 10 10:16:44 compute-0 sudo[197781]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 10 10:16:44 compute-0 sudo[197781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 10 10:16:44 compute-0 systemd[1]: 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb-1226dbc449d54e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 10 10:16:44 compute-0 systemd[1]: 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb-1226dbc449d54e1.service: Failed with result 'exit-code'.
Dec 10 10:16:44 compute-0 sudo[197781]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: + [[ ! -n '' ]]
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: + . kolla_extend_start
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: + umask 0022
Dec 10 10:16:44 compute-0 ceilometer_agent_compute[197748]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 10 10:16:44 compute-0 sudo[197930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejpexkcszqcholomnacupvarjbphfwhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361804.4201176-578-251532720852046/AnsiballZ_stat.py'
Dec 10 10:16:44 compute-0 sudo[197930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:44 compute-0 python3.9[197932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:44 compute-0 sudo[197930]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.122 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.122 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.122 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.123 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.123 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.123 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.123 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.123 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.123 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.123 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.123 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.123 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.123 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.124 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.124 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.124 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.124 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.124 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.124 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.124 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.124 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.124 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.124 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.124 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.125 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.126 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.126 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.126 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.126 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.126 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.126 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.126 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.126 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.126 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.126 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.126 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.126 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.127 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.128 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.128 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.128 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.128 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.128 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.128 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.128 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.128 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.128 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.128 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.128 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.129 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.129 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.129 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.129 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.129 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.129 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.129 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.129 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.129 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.129 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.129 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.130 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.130 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.130 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.130 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.130 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.130 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.130 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.130 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.130 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.130 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.130 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.131 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.131 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.131 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.131 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.131 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.131 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.131 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.131 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.131 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.131 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.132 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.132 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.132 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.132 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.132 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.132 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.132 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.132 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.132 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.133 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.133 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.133 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.133 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.133 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.133 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.133 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.133 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.133 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.134 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.134 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.134 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.134 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.134 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.134 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.134 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.134 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.134 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.135 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.135 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.135 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.135 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.135 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.135 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.135 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.135 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.135 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.136 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.136 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.136 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.136 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.136 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.136 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.136 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.136 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.136 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.137 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.137 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.137 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.137 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.137 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.137 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.137 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.137 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.137 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.137 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.138 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.138 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.138 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.138 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.138 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.156 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.157 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.158 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.252 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 10 10:16:45 compute-0 sudo[198056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kchmmeuvbwtzaxqxerpazhpgzuukdcpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361804.4201176-578-251532720852046/AnsiballZ_copy.py'
Dec 10 10:16:45 compute-0 sudo[198056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.368 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.369 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.369 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.369 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.369 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.369 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.370 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.370 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.370 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.370 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.371 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.371 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.371 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.371 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.372 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.372 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.372 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.372 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.372 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.373 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.373 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.373 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.373 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.374 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.374 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.374 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.374 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.374 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.375 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.375 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.375 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.375 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.375 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.376 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.376 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.376 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.376 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.376 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.376 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.377 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.377 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.377 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.377 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.378 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.378 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.378 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.378 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.378 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.378 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.379 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.379 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.379 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.379 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.379 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.380 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.380 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.380 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.380 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.380 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.381 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.381 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.381 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.381 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.381 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.382 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.382 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.382 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.382 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.382 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.383 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.383 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.383 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.383 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.383 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.384 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.384 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.384 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.384 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.384 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.385 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.385 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.385 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.385 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.385 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.386 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.386 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.386 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.386 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.386 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.387 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.387 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.387 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.387 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.387 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.388 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.388 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.388 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.388 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.388 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.389 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.389 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.389 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.389 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.390 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.390 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.390 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.390 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.390 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.391 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.391 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.391 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.391 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.391 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.392 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.392 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.392 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.392 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.392 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.393 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.393 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.393 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.393 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.393 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.394 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.394 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.394 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.394 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.394 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.395 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.396 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.397 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.398 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.398 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.398 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.398 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.398 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.399 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.399 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.399 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.399 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.399 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.399 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.400 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.400 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.400 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.400 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.400 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.401 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.401 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.401 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.401 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.401 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.402 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.402 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.402 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.402 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.402 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.402 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.403 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.403 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.403 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.403 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.403 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.404 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.404 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.404 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.404 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.404 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.405 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.405 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.405 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.405 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.405 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.405 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.406 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.406 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.406 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.406 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.406 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.407 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.407 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.407 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.407 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.407 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.408 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.408 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.408 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.408 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.408 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.409 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.409 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.409 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.409 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.409 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.410 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.410 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.410 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.410 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.410 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.411 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.411 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.411 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.411 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.411 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.411 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.412 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.412 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.414 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.420 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:16:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:16:45 compute-0 python3.9[198058]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361804.4201176-578-251532720852046/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:16:45 compute-0 sudo[198056]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:46 compute-0 sudo[198211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxjuagfbbhloxladegzndtkbvvzomlut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361805.913424-595-208226944040027/AnsiballZ_container_config_data.py'
Dec 10 10:16:46 compute-0 sudo[198211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:46 compute-0 python3.9[198213]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Dec 10 10:16:46 compute-0 sudo[198211]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:46 compute-0 sudo[198363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjujkmeutjindqwqmklkrwljishdlhzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361806.6292784-604-247110699187572/AnsiballZ_container_config_hash.py'
Dec 10 10:16:46 compute-0 sudo[198363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:47 compute-0 python3.9[198365]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 10 10:16:47 compute-0 sudo[198363]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:47 compute-0 sudo[198515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgtpokcwvogwerddmzmfuuedghrnfqsl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361807.4701967-614-233949367724480/AnsiballZ_edpm_container_manage.py'
Dec 10 10:16:47 compute-0 sudo[198515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:48 compute-0 python3[198517]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 10 10:16:48 compute-0 podman[198552]: 2025-12-10 10:16:48.291441887 +0000 UTC m=+0.056261925 container create 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm)
Dec 10 10:16:48 compute-0 podman[198552]: 2025-12-10 10:16:48.258147881 +0000 UTC m=+0.022967899 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec 10 10:16:48 compute-0 python3[198517]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Dec 10 10:16:48 compute-0 sudo[198515]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:49 compute-0 sudo[198740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixwjpbuyheiynkdulvdksdzaxwjzmxwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361808.7292025-622-75411444413053/AnsiballZ_stat.py'
Dec 10 10:16:49 compute-0 sudo[198740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:49 compute-0 python3.9[198742]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:16:49 compute-0 sudo[198740]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:49 compute-0 sudo[198894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahhoqrsgjcipegslnolqvroebhlfyikx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361809.5187044-631-51667347352666/AnsiballZ_file.py'
Dec 10 10:16:49 compute-0 sudo[198894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:50 compute-0 python3.9[198896]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:50 compute-0 sudo[198894]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:50 compute-0 sudo[199045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmmxaceumjzuytuqloevaswcoutniswa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361810.115259-631-50633582085301/AnsiballZ_copy.py'
Dec 10 10:16:50 compute-0 sudo[199045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:50 compute-0 python3.9[199047]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765361810.115259-631-50633582085301/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:16:50 compute-0 sudo[199045]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:51 compute-0 sudo[199121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikbxmudjvwiklcgnagxktmdgjcyhaqem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361810.115259-631-50633582085301/AnsiballZ_systemd.py'
Dec 10 10:16:51 compute-0 sudo[199121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:51 compute-0 python3.9[199123]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:16:51 compute-0 systemd[1]: Reloading.
Dec 10 10:16:51 compute-0 systemd-sysv-generator[199153]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:16:51 compute-0 systemd-rc-local-generator[199150]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:16:51 compute-0 sudo[199121]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:52 compute-0 sudo[199231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqrsgviwjtlnozqrtvhsfwqgafkszwzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361810.115259-631-50633582085301/AnsiballZ_systemd.py'
Dec 10 10:16:52 compute-0 sudo[199231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:52 compute-0 python3.9[199233]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:16:52 compute-0 systemd[1]: Reloading.
Dec 10 10:16:52 compute-0 systemd-sysv-generator[199262]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:16:52 compute-0 systemd-rc-local-generator[199258]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:16:52 compute-0 systemd[1]: Starting node_exporter container...
Dec 10 10:16:52 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:16:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3af0f6fc2621abe2c5cca36dcec156ffe0dfcc718eddb9ba5d62a48bac45e6a0/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 10 10:16:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3af0f6fc2621abe2c5cca36dcec156ffe0dfcc718eddb9ba5d62a48bac45e6a0/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 10 10:16:52 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12.
Dec 10 10:16:52 compute-0 podman[199273]: 2025-12-10 10:16:52.895441885 +0000 UTC m=+0.101902265 container init 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.905Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.905Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.905Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.906Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.906Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.907Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.907Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.907Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.907Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.907Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.907Z caller=node_exporter.go:117 level=info collector=arp
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.907Z caller=node_exporter.go:117 level=info collector=bcache
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.907Z caller=node_exporter.go:117 level=info collector=bonding
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.907Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.907Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=cpu
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=edac
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=filefd
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=netclass
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=netdev
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=netstat
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=nfs
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=nvme
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.908Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.909Z caller=node_exporter.go:117 level=info collector=softnet
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.909Z caller=node_exporter.go:117 level=info collector=systemd
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.909Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.909Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.909Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.909Z caller=node_exporter.go:117 level=info collector=xfs
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.909Z caller=node_exporter.go:117 level=info collector=zfs
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.910Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 10 10:16:52 compute-0 node_exporter[199288]: ts=2025-12-10T10:16:52.910Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec 10 10:16:52 compute-0 podman[199273]: 2025-12-10 10:16:52.918747072 +0000 UTC m=+0.125207452 container start 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 10 10:16:52 compute-0 podman[199273]: node_exporter
Dec 10 10:16:52 compute-0 systemd[1]: Started node_exporter container.
Dec 10 10:16:52 compute-0 sudo[199231]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:52 compute-0 podman[199297]: 2025-12-10 10:16:52.980102994 +0000 UTC m=+0.052217797 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 10 10:16:53 compute-0 sudo[199470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieajvxnlfwzitgpnxendyhjshlerowas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361813.119354-655-90608469458014/AnsiballZ_systemd.py'
Dec 10 10:16:53 compute-0 sudo[199470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:53 compute-0 python3.9[199472]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:16:53 compute-0 systemd[1]: Stopping node_exporter container...
Dec 10 10:16:53 compute-0 systemd[1]: libpod-926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12.scope: Deactivated successfully.
Dec 10 10:16:53 compute-0 podman[199476]: 2025-12-10 10:16:53.96213671 +0000 UTC m=+0.057606052 container died 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:16:53 compute-0 systemd[1]: 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12-7cf2781a3a420ecb.timer: Deactivated successfully.
Dec 10 10:16:53 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12.
Dec 10 10:16:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12-userdata-shm.mount: Deactivated successfully.
Dec 10 10:16:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-3af0f6fc2621abe2c5cca36dcec156ffe0dfcc718eddb9ba5d62a48bac45e6a0-merged.mount: Deactivated successfully.
Dec 10 10:16:54 compute-0 podman[199476]: 2025-12-10 10:16:54.010896432 +0000 UTC m=+0.106365744 container cleanup 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 10 10:16:54 compute-0 podman[199476]: node_exporter
Dec 10 10:16:54 compute-0 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 10 10:16:54 compute-0 podman[199503]: node_exporter
Dec 10 10:16:54 compute-0 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Dec 10 10:16:54 compute-0 systemd[1]: Stopped node_exporter container.
Dec 10 10:16:54 compute-0 systemd[1]: Starting node_exporter container...
Dec 10 10:16:54 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:16:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3af0f6fc2621abe2c5cca36dcec156ffe0dfcc718eddb9ba5d62a48bac45e6a0/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 10 10:16:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3af0f6fc2621abe2c5cca36dcec156ffe0dfcc718eddb9ba5d62a48bac45e6a0/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 10 10:16:54 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12.
Dec 10 10:16:54 compute-0 podman[199516]: 2025-12-10 10:16:54.208789879 +0000 UTC m=+0.127137463 container init 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.225Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.225Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.225Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.226Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.226Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.227Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.227Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.227Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=arp
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=bcache
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=bonding
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=cpu
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=edac
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=filefd
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=netclass
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=netdev
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=netstat
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=nfs
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=nvme
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=softnet
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=systemd
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=xfs
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.228Z caller=node_exporter.go:117 level=info collector=zfs
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.229Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 10 10:16:54 compute-0 node_exporter[199532]: ts=2025-12-10T10:16:54.230Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec 10 10:16:54 compute-0 podman[199516]: 2025-12-10 10:16:54.232488567 +0000 UTC m=+0.150835931 container start 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:16:54 compute-0 podman[199516]: node_exporter
Dec 10 10:16:54 compute-0 systemd[1]: Started node_exporter container.
Dec 10 10:16:54 compute-0 sudo[199470]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:54 compute-0 podman[199541]: 2025-12-10 10:16:54.304564788 +0000 UTC m=+0.063611974 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:16:54 compute-0 sudo[199714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vblikygdnxpcfgwgkuouwetzhckujtnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361814.4250557-663-64811806149494/AnsiballZ_stat.py'
Dec 10 10:16:54 compute-0 sudo[199714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:54 compute-0 python3.9[199716]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:16:54 compute-0 sudo[199714]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:55 compute-0 sudo[199837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mohrqfdvaqviupramwcelylljojygvit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361814.4250557-663-64811806149494/AnsiballZ_copy.py'
Dec 10 10:16:55 compute-0 sudo[199837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:55 compute-0 python3.9[199839]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361814.4250557-663-64811806149494/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:16:55 compute-0 sudo[199837]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:56 compute-0 sudo[199989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktrkccsiyokqrhwqoitubsbwrsnhovri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361815.856378-680-13874287309424/AnsiballZ_container_config_data.py'
Dec 10 10:16:56 compute-0 sudo[199989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:56 compute-0 python3.9[199991]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec 10 10:16:56 compute-0 sudo[199989]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:56 compute-0 sudo[200141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osfybvzmfelvpfcpbtfjsdfbornmpscu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361816.6384957-689-190377791179264/AnsiballZ_container_config_hash.py'
Dec 10 10:16:56 compute-0 sudo[200141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:57 compute-0 python3.9[200143]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 10 10:16:57 compute-0 sudo[200141]: pam_unix(sudo:session): session closed for user root
Dec 10 10:16:57 compute-0 sudo[200293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pehtyoltqmhgnoerpnyidnhcmtlbcywy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361817.463436-699-150306480848238/AnsiballZ_edpm_container_manage.py'
Dec 10 10:16:57 compute-0 sudo[200293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:16:57 compute-0 python3[200295]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 10 10:16:59 compute-0 podman[200309]: 2025-12-10 10:16:59.803377584 +0000 UTC m=+1.714939657 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 10 10:16:59 compute-0 podman[200404]: 2025-12-10 10:16:59.953228128 +0000 UTC m=+0.060704805 container create ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible)
Dec 10 10:16:59 compute-0 podman[200404]: 2025-12-10 10:16:59.92324656 +0000 UTC m=+0.030723277 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 10 10:16:59 compute-0 python3[200295]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Dec 10 10:17:00 compute-0 sudo[200293]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:00 compute-0 sudo[200592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrwpikdbrjxkrjexdbnnkpuovykgefpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361820.4323924-707-245407859748204/AnsiballZ_stat.py'
Dec 10 10:17:00 compute-0 sudo[200592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:01 compute-0 python3.9[200594]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:17:01 compute-0 sudo[200592]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:01 compute-0 sudo[200746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emwiwyskxeukpbfokbmincoexzcxizdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361821.2915118-716-156681016786859/AnsiballZ_file.py'
Dec 10 10:17:01 compute-0 sudo[200746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:01 compute-0 python3.9[200748]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:01 compute-0 sudo[200746]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:02 compute-0 sudo[200897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ribcboxwdmabkirvjiufrliygnikfsqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361821.8649886-716-45354183162285/AnsiballZ_copy.py'
Dec 10 10:17:02 compute-0 sudo[200897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:02 compute-0 python3.9[200899]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765361821.8649886-716-45354183162285/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:02 compute-0 sudo[200897]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:02 compute-0 sudo[200973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpiarrbvwevxvbkemigunaazewczytjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361821.8649886-716-45354183162285/AnsiballZ_systemd.py'
Dec 10 10:17:02 compute-0 sudo[200973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:03 compute-0 python3.9[200975]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:17:03 compute-0 systemd[1]: Reloading.
Dec 10 10:17:03 compute-0 systemd-rc-local-generator[201000]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:17:03 compute-0 systemd-sysv-generator[201006]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:17:03 compute-0 sudo[200973]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:03 compute-0 sudo[201084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpvmlpigljutdpbvhionrzdfnvwqazbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361821.8649886-716-45354183162285/AnsiballZ_systemd.py'
Dec 10 10:17:03 compute-0 sudo[201084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:04 compute-0 python3.9[201086]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:17:04 compute-0 systemd[1]: Reloading.
Dec 10 10:17:04 compute-0 systemd-sysv-generator[201117]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:17:04 compute-0 systemd-rc-local-generator[201114]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:17:04 compute-0 systemd[1]: Starting podman_exporter container...
Dec 10 10:17:04 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:17:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dea7a0bbbe2b978da79d5ffbc6350c9c1e8d5d98e851972104f798673d0d2baa/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 10 10:17:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dea7a0bbbe2b978da79d5ffbc6350c9c1e8d5d98e851972104f798673d0d2baa/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 10 10:17:04 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9.
Dec 10 10:17:04 compute-0 podman[201126]: 2025-12-10 10:17:04.618614818 +0000 UTC m=+0.145377454 container init ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:17:04 compute-0 podman_exporter[201142]: ts=2025-12-10T10:17:04.636Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 10 10:17:04 compute-0 podman_exporter[201142]: ts=2025-12-10T10:17:04.636Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 10 10:17:04 compute-0 podman_exporter[201142]: ts=2025-12-10T10:17:04.636Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 10 10:17:04 compute-0 podman_exporter[201142]: ts=2025-12-10T10:17:04.636Z caller=handler.go:105 level=info collector=container
Dec 10 10:17:04 compute-0 podman[201126]: 2025-12-10 10:17:04.65398402 +0000 UTC m=+0.180746676 container start ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:17:04 compute-0 podman[201126]: podman_exporter
Dec 10 10:17:04 compute-0 systemd[1]: Starting Podman API Service...
Dec 10 10:17:04 compute-0 systemd[1]: Started Podman API Service.
Dec 10 10:17:04 compute-0 systemd[1]: Started podman_exporter container.
Dec 10 10:17:04 compute-0 podman[201153]: time="2025-12-10T10:17:04Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec 10 10:17:04 compute-0 podman[201153]: time="2025-12-10T10:17:04Z" level=info msg="Setting parallel job count to 25"
Dec 10 10:17:04 compute-0 podman[201153]: time="2025-12-10T10:17:04Z" level=info msg="Using sqlite as database backend"
Dec 10 10:17:04 compute-0 podman[201153]: time="2025-12-10T10:17:04Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec 10 10:17:04 compute-0 podman[201153]: time="2025-12-10T10:17:04Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec 10 10:17:04 compute-0 podman[201153]: time="2025-12-10T10:17:04Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Dec 10 10:17:04 compute-0 sudo[201084]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:04 compute-0 podman[201153]: @ - - [10/Dec/2025:10:17:04 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 10 10:17:04 compute-0 podman[201153]: time="2025-12-10T10:17:04Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 10 10:17:04 compute-0 podman[201153]: @ - - [10/Dec/2025:10:17:04 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19569 "" "Go-http-client/1.1"
Dec 10 10:17:04 compute-0 podman[201152]: 2025-12-10 10:17:04.757688112 +0000 UTC m=+0.086503170 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 10 10:17:04 compute-0 podman_exporter[201142]: ts=2025-12-10T10:17:04.758Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 10 10:17:04 compute-0 podman_exporter[201142]: ts=2025-12-10T10:17:04.759Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 10 10:17:04 compute-0 podman_exporter[201142]: ts=2025-12-10T10:17:04.760Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 10 10:17:04 compute-0 systemd[1]: ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9-1ab0d45df6dbd842.service: Main process exited, code=exited, status=1/FAILURE
Dec 10 10:17:04 compute-0 systemd[1]: ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9-1ab0d45df6dbd842.service: Failed with result 'exit-code'.
Dec 10 10:17:05 compute-0 sudo[201339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naqnoopsfcadihpehjnoxkwneyvslyeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361824.9111335-740-132221333542991/AnsiballZ_systemd.py'
Dec 10 10:17:05 compute-0 sudo[201339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:05 compute-0 python3.9[201341]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:17:05 compute-0 systemd[1]: Stopping podman_exporter container...
Dec 10 10:17:05 compute-0 podman[201153]: @ - - [10/Dec/2025:10:17:04 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Dec 10 10:17:05 compute-0 systemd[1]: libpod-ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9.scope: Deactivated successfully.
Dec 10 10:17:05 compute-0 podman[201345]: 2025-12-10 10:17:05.641574366 +0000 UTC m=+0.061460015 container died ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 10 10:17:05 compute-0 systemd[1]: ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9-1ab0d45df6dbd842.timer: Deactivated successfully.
Dec 10 10:17:05 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9.
Dec 10 10:17:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9-userdata-shm.mount: Deactivated successfully.
Dec 10 10:17:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-dea7a0bbbe2b978da79d5ffbc6350c9c1e8d5d98e851972104f798673d0d2baa-merged.mount: Deactivated successfully.
Dec 10 10:17:06 compute-0 podman[201345]: 2025-12-10 10:17:06.392417809 +0000 UTC m=+0.812303448 container cleanup ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 10 10:17:06 compute-0 podman[201345]: podman_exporter
Dec 10 10:17:06 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 10 10:17:06 compute-0 podman[201374]: podman_exporter
Dec 10 10:17:06 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec 10 10:17:06 compute-0 systemd[1]: Stopped podman_exporter container.
Dec 10 10:17:06 compute-0 systemd[1]: Starting podman_exporter container...
Dec 10 10:17:06 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:17:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dea7a0bbbe2b978da79d5ffbc6350c9c1e8d5d98e851972104f798673d0d2baa/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 10 10:17:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dea7a0bbbe2b978da79d5ffbc6350c9c1e8d5d98e851972104f798673d0d2baa/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 10 10:17:06 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9.
Dec 10 10:17:06 compute-0 podman[201387]: 2025-12-10 10:17:06.611520636 +0000 UTC m=+0.132005464 container init ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 10 10:17:06 compute-0 podman_exporter[201403]: ts=2025-12-10T10:17:06.631Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 10 10:17:06 compute-0 podman_exporter[201403]: ts=2025-12-10T10:17:06.631Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 10 10:17:06 compute-0 podman_exporter[201403]: ts=2025-12-10T10:17:06.631Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 10 10:17:06 compute-0 podman_exporter[201403]: ts=2025-12-10T10:17:06.631Z caller=handler.go:105 level=info collector=container
Dec 10 10:17:06 compute-0 podman[201153]: @ - - [10/Dec/2025:10:17:06 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 10 10:17:06 compute-0 podman[201153]: time="2025-12-10T10:17:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 10 10:17:06 compute-0 podman[201387]: 2025-12-10 10:17:06.635174003 +0000 UTC m=+0.155658801 container start ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:17:06 compute-0 podman[201387]: podman_exporter
Dec 10 10:17:06 compute-0 systemd[1]: Started podman_exporter container.
Dec 10 10:17:06 compute-0 podman[201153]: @ - - [10/Dec/2025:10:17:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19571 "" "Go-http-client/1.1"
Dec 10 10:17:06 compute-0 podman_exporter[201403]: ts=2025-12-10T10:17:06.672Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 10 10:17:06 compute-0 podman_exporter[201403]: ts=2025-12-10T10:17:06.673Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 10 10:17:06 compute-0 podman_exporter[201403]: ts=2025-12-10T10:17:06.674Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 10 10:17:06 compute-0 sudo[201339]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:06 compute-0 podman[201412]: 2025-12-10 10:17:06.742943464 +0000 UTC m=+0.094685449 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 10 10:17:07 compute-0 sudo[201588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkfeunjdqtfjqgjzyghvigccgkvayxpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361826.8767178-748-2892171513530/AnsiballZ_stat.py'
Dec 10 10:17:07 compute-0 sudo[201588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:07 compute-0 python3.9[201590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:17:07 compute-0 sudo[201588]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:07 compute-0 auditd[701]: Audit daemon rotating log files
Dec 10 10:17:07 compute-0 sudo[201724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqjtjrypnuyidlsebpjrzjxhosccfdjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361826.8767178-748-2892171513530/AnsiballZ_copy.py'
Dec 10 10:17:07 compute-0 sudo[201724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:07 compute-0 podman[201685]: 2025-12-10 10:17:07.816969847 +0000 UTC m=+0.088999017 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 10 10:17:07 compute-0 python3.9[201730]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765361826.8767178-748-2892171513530/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 10 10:17:08 compute-0 sudo[201724]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:08 compute-0 sudo[201882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbtsppvbeddcngxmoiswsiacitnbtkku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361828.2725337-765-133867729561383/AnsiballZ_container_config_data.py'
Dec 10 10:17:08 compute-0 sudo[201882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:08 compute-0 python3.9[201884]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec 10 10:17:08 compute-0 sudo[201882]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:09 compute-0 sudo[202034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkbshkypclrcqbrtewklciaxicsyhuvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361829.0593224-774-45653112970083/AnsiballZ_container_config_hash.py'
Dec 10 10:17:09 compute-0 sudo[202034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:09 compute-0 python3.9[202036]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 10 10:17:09 compute-0 sudo[202034]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:10 compute-0 sudo[202186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekvkcjowqjdlrzpixldqxbmvxouldldd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361829.9054012-784-140007670748/AnsiballZ_edpm_container_manage.py'
Dec 10 10:17:10 compute-0 sudo[202186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:10 compute-0 python3[202188]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 10 10:17:14 compute-0 podman[202246]: 2025-12-10 10:17:14.003756843 +0000 UTC m=+1.017374417 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:17:14 compute-0 podman[202273]: 2025-12-10 10:17:14.310911127 +0000 UTC m=+0.327057459 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 10 10:17:14 compute-0 podman[202202]: 2025-12-10 10:17:14.318951797 +0000 UTC m=+3.598603223 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 10 10:17:14 compute-0 podman[202321]: 2025-12-10 10:17:14.412057695 +0000 UTC m=+0.063128229 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute)
Dec 10 10:17:14 compute-0 systemd[1]: 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb-1226dbc449d54e1.service: Main process exited, code=exited, status=1/FAILURE
Dec 10 10:17:14 compute-0 systemd[1]: 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb-1226dbc449d54e1.service: Failed with result 'exit-code'.
Dec 10 10:17:14 compute-0 podman[202362]: 2025-12-10 10:17:14.472667073 +0000 UTC m=+0.049071433 container create 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Dec 10 10:17:14 compute-0 podman[202362]: 2025-12-10 10:17:14.449795107 +0000 UTC m=+0.026199487 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 10 10:17:14 compute-0 python3[202188]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 10 10:17:14 compute-0 sudo[202186]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:15 compute-0 sudo[202551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhecuaaodwqcgwmpegtcmwbktfcmlpqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361834.7461655-792-201679778468201/AnsiballZ_stat.py'
Dec 10 10:17:15 compute-0 sudo[202551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:15 compute-0 python3.9[202553]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:17:15 compute-0 sudo[202551]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:15 compute-0 sudo[202705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqfnkjjssmfjupyredhyjzmrwotshqgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361835.5420566-801-187720156689226/AnsiballZ_file.py'
Dec 10 10:17:15 compute-0 sudo[202705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:16 compute-0 python3.9[202707]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:16 compute-0 sudo[202705]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:16 compute-0 sudo[202856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-potnfcxnotpggglfimpebvtooqriwjns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361836.263827-801-120821972404981/AnsiballZ_copy.py'
Dec 10 10:17:16 compute-0 sudo[202856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:16 compute-0 python3.9[202858]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765361836.263827-801-120821972404981/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:16 compute-0 sudo[202856]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:17 compute-0 sudo[202932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlcbsfytkqawetpmbubhljupdgpsnyun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361836.263827-801-120821972404981/AnsiballZ_systemd.py'
Dec 10 10:17:17 compute-0 sudo[202932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:17 compute-0 python3.9[202934]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 10 10:17:17 compute-0 systemd[1]: Reloading.
Dec 10 10:17:17 compute-0 systemd-rc-local-generator[202957]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:17:17 compute-0 systemd-sysv-generator[202964]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:17:17 compute-0 sudo[202932]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:18 compute-0 sudo[203043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvwdjdqikzcacovlcjvjnwymleghfkvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361836.263827-801-120821972404981/AnsiballZ_systemd.py'
Dec 10 10:17:18 compute-0 sudo[203043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:18 compute-0 python3.9[203045]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 10 10:17:18 compute-0 systemd[1]: Reloading.
Dec 10 10:17:18 compute-0 systemd-rc-local-generator[203066]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 10 10:17:18 compute-0 systemd-sysv-generator[203072]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 10 10:17:18 compute-0 systemd[1]: Starting openstack_network_exporter container...
Dec 10 10:17:18 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:17:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91e273b56ffc365154ab98c84d930a303d5814780658e4e05841100ac4f39ae3/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 10 10:17:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91e273b56ffc365154ab98c84d930a303d5814780658e4e05841100ac4f39ae3/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 10 10:17:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91e273b56ffc365154ab98c84d930a303d5814780658e4e05841100ac4f39ae3/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 10 10:17:19 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e.
Dec 10 10:17:19 compute-0 podman[203085]: 2025-12-10 10:17:19.032517293 +0000 UTC m=+0.123999364 container init 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=edpm, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64)
Dec 10 10:17:19 compute-0 openstack_network_exporter[203100]: INFO    10:17:19 main.go:48: registering *bridge.Collector
Dec 10 10:17:19 compute-0 openstack_network_exporter[203100]: INFO    10:17:19 main.go:48: registering *coverage.Collector
Dec 10 10:17:19 compute-0 openstack_network_exporter[203100]: INFO    10:17:19 main.go:48: registering *datapath.Collector
Dec 10 10:17:19 compute-0 openstack_network_exporter[203100]: INFO    10:17:19 main.go:48: registering *iface.Collector
Dec 10 10:17:19 compute-0 openstack_network_exporter[203100]: INFO    10:17:19 main.go:48: registering *memory.Collector
Dec 10 10:17:19 compute-0 openstack_network_exporter[203100]: INFO    10:17:19 main.go:48: registering *ovnnorthd.Collector
Dec 10 10:17:19 compute-0 openstack_network_exporter[203100]: INFO    10:17:19 main.go:48: registering *ovn.Collector
Dec 10 10:17:19 compute-0 openstack_network_exporter[203100]: INFO    10:17:19 main.go:48: registering *ovsdbserver.Collector
Dec 10 10:17:19 compute-0 openstack_network_exporter[203100]: INFO    10:17:19 main.go:48: registering *pmd_perf.Collector
Dec 10 10:17:19 compute-0 openstack_network_exporter[203100]: INFO    10:17:19 main.go:48: registering *pmd_rxq.Collector
Dec 10 10:17:19 compute-0 openstack_network_exporter[203100]: INFO    10:17:19 main.go:48: registering *vswitch.Collector
Dec 10 10:17:19 compute-0 openstack_network_exporter[203100]: NOTICE  10:17:19 main.go:76: listening on https://:9105/metrics
Dec 10 10:17:19 compute-0 podman[203085]: 2025-12-10 10:17:19.060364944 +0000 UTC m=+0.151846945 container start 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64)
Dec 10 10:17:19 compute-0 podman[203085]: openstack_network_exporter
Dec 10 10:17:19 compute-0 systemd[1]: Started openstack_network_exporter container.
Dec 10 10:17:19 compute-0 sudo[203043]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:19 compute-0 podman[203105]: 2025-12-10 10:17:19.171407073 +0000 UTC m=+0.094771954 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 10 10:17:19 compute-0 sudo[203282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enhihxfbxluitgxnibdybekaoqqwsduv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361839.2899837-825-228894886632104/AnsiballZ_systemd.py'
Dec 10 10:17:19 compute-0 sudo[203282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:19 compute-0 python3.9[203284]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 10 10:17:19 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Dec 10 10:17:20 compute-0 systemd[1]: libpod-70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e.scope: Deactivated successfully.
Dec 10 10:17:20 compute-0 conmon[203100]: conmon 70573056c2f1509f7aea <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e.scope/container/memory.events
Dec 10 10:17:20 compute-0 podman[203288]: 2025-12-10 10:17:20.027870056 +0000 UTC m=+0.058856552 container died 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Dec 10 10:17:20 compute-0 systemd[1]: 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e-5bfab28b7b3e862.timer: Deactivated successfully.
Dec 10 10:17:20 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e.
Dec 10 10:17:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e-userdata-shm.mount: Deactivated successfully.
Dec 10 10:17:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-91e273b56ffc365154ab98c84d930a303d5814780658e4e05841100ac4f39ae3-merged.mount: Deactivated successfully.
Dec 10 10:17:20 compute-0 podman[203288]: 2025-12-10 10:17:20.997775253 +0000 UTC m=+1.028761759 container cleanup 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 10 10:17:20 compute-0 podman[203288]: openstack_network_exporter
Dec 10 10:17:21 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 10 10:17:21 compute-0 podman[203317]: openstack_network_exporter
Dec 10 10:17:21 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec 10 10:17:21 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Dec 10 10:17:21 compute-0 systemd[1]: Starting openstack_network_exporter container...
Dec 10 10:17:21 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:17:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91e273b56ffc365154ab98c84d930a303d5814780658e4e05841100ac4f39ae3/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 10 10:17:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91e273b56ffc365154ab98c84d930a303d5814780658e4e05841100ac4f39ae3/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 10 10:17:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91e273b56ffc365154ab98c84d930a303d5814780658e4e05841100ac4f39ae3/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 10 10:17:21 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e.
Dec 10 10:17:21 compute-0 podman[203330]: 2025-12-10 10:17:21.224887817 +0000 UTC m=+0.133873843 container init 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 10 10:17:21 compute-0 openstack_network_exporter[203346]: INFO    10:17:21 main.go:48: registering *bridge.Collector
Dec 10 10:17:21 compute-0 openstack_network_exporter[203346]: INFO    10:17:21 main.go:48: registering *coverage.Collector
Dec 10 10:17:21 compute-0 openstack_network_exporter[203346]: INFO    10:17:21 main.go:48: registering *datapath.Collector
Dec 10 10:17:21 compute-0 openstack_network_exporter[203346]: INFO    10:17:21 main.go:48: registering *iface.Collector
Dec 10 10:17:21 compute-0 openstack_network_exporter[203346]: INFO    10:17:21 main.go:48: registering *memory.Collector
Dec 10 10:17:21 compute-0 openstack_network_exporter[203346]: INFO    10:17:21 main.go:48: registering *ovnnorthd.Collector
Dec 10 10:17:21 compute-0 openstack_network_exporter[203346]: INFO    10:17:21 main.go:48: registering *ovn.Collector
Dec 10 10:17:21 compute-0 openstack_network_exporter[203346]: INFO    10:17:21 main.go:48: registering *ovsdbserver.Collector
Dec 10 10:17:21 compute-0 openstack_network_exporter[203346]: INFO    10:17:21 main.go:48: registering *pmd_perf.Collector
Dec 10 10:17:21 compute-0 openstack_network_exporter[203346]: INFO    10:17:21 main.go:48: registering *pmd_rxq.Collector
Dec 10 10:17:21 compute-0 openstack_network_exporter[203346]: INFO    10:17:21 main.go:48: registering *vswitch.Collector
Dec 10 10:17:21 compute-0 openstack_network_exporter[203346]: NOTICE  10:17:21 main.go:76: listening on https://:9105/metrics
Dec 10 10:17:21 compute-0 podman[203330]: 2025-12-10 10:17:21.249526172 +0000 UTC m=+0.158512168 container start 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Dec 10 10:17:21 compute-0 podman[203330]: openstack_network_exporter
Dec 10 10:17:21 compute-0 systemd[1]: Started openstack_network_exporter container.
Dec 10 10:17:21 compute-0 sudo[203282]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:21 compute-0 podman[203356]: 2025-12-10 10:17:21.316524535 +0000 UTC m=+0.056180958 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible)
Dec 10 10:17:21 compute-0 sudo[203527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpeivfgmdwelffnqilrlemjezvanfcsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361841.460978-833-39038619506822/AnsiballZ_find.py'
Dec 10 10:17:21 compute-0 sudo[203527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:21 compute-0 python3.9[203529]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 10 10:17:21 compute-0 sudo[203527]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:22 compute-0 sudo[203679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eokhyngrhsrvzfjcbtzenyaxzahgkipg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361842.2713027-843-118543976293025/AnsiballZ_podman_container_info.py'
Dec 10 10:17:22 compute-0 sudo[203679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:22 compute-0 python3.9[203681]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec 10 10:17:23 compute-0 sudo[203679]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:23 compute-0 sudo[203844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afnlkjdbykawicxexwwszeuijvdrhhhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361843.2205079-851-256810594475680/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:23 compute-0 sudo[203844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:23 compute-0 python3.9[203846]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:24 compute-0 systemd[1]: Started libpod-conmon-e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6.scope.
Dec 10 10:17:24 compute-0 podman[203847]: 2025-12-10 10:17:24.013053144 +0000 UTC m=+0.081221014 container exec e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:17:24 compute-0 podman[203847]: 2025-12-10 10:17:24.048166605 +0000 UTC m=+0.116334465 container exec_died e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 10 10:17:24 compute-0 systemd[1]: libpod-conmon-e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6.scope: Deactivated successfully.
Dec 10 10:17:24 compute-0 sudo[203844]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:24 compute-0 sudo[204033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpzqeompmvjujcgttpqjkpmtakoymmow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361844.2804682-859-97515639944247/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:24 compute-0 sudo[204033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:24 compute-0 podman[204001]: 2025-12-10 10:17:24.682663435 +0000 UTC m=+0.085886651 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:17:24 compute-0 python3.9[204041]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:24 compute-0 systemd[1]: Started libpod-conmon-e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6.scope.
Dec 10 10:17:24 compute-0 podman[204053]: 2025-12-10 10:17:24.905861361 +0000 UTC m=+0.069030729 container exec e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:17:24 compute-0 podman[204053]: 2025-12-10 10:17:24.915086254 +0000 UTC m=+0.078255602 container exec_died e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible)
Dec 10 10:17:24 compute-0 sudo[204033]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:24 compute-0 systemd[1]: libpod-conmon-e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6.scope: Deactivated successfully.
Dec 10 10:17:25 compute-0 sudo[204234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmvfmwoaesbdzrkfzmczfluxlrngtjvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361845.134982-867-150263509955757/AnsiballZ_file.py'
Dec 10 10:17:25 compute-0 sudo[204234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:25 compute-0 python3.9[204236]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:25 compute-0 sudo[204234]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:26 compute-0 sudo[204386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsgjxgevberadihgccxxordlpppeeyxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361845.8965077-876-228069787231556/AnsiballZ_podman_container_info.py'
Dec 10 10:17:26 compute-0 sudo[204386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:26 compute-0 python3.9[204388]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec 10 10:17:26 compute-0 sudo[204386]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:27 compute-0 sudo[204551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpihhgbiexjtdmoqxdjdtdebdrdpbtha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361846.7393475-884-16192772482038/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:27 compute-0 sudo[204551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:27 compute-0 python3.9[204553]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:27 compute-0 systemd[1]: Started libpod-conmon-3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4.scope.
Dec 10 10:17:27 compute-0 podman[204554]: 2025-12-10 10:17:27.402595784 +0000 UTC m=+0.105351064 container exec 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:17:27 compute-0 podman[204554]: 2025-12-10 10:17:27.438088845 +0000 UTC m=+0.140844075 container exec_died 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 10 10:17:27 compute-0 sudo[204551]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:27 compute-0 systemd[1]: libpod-conmon-3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4.scope: Deactivated successfully.
Dec 10 10:17:27 compute-0 sudo[204736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akhoyvzqexietuptohibunazxxqsrqcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361847.6423037-892-169548700973768/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:27 compute-0 sudo[204736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:28 compute-0 python3.9[204738]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:28 compute-0 systemd[1]: Started libpod-conmon-3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4.scope.
Dec 10 10:17:28 compute-0 podman[204739]: 2025-12-10 10:17:28.233503078 +0000 UTC m=+0.074487199 container exec 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:17:28 compute-0 podman[204739]: 2025-12-10 10:17:28.268023743 +0000 UTC m=+0.109007854 container exec_died 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 10 10:17:28 compute-0 systemd[1]: libpod-conmon-3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4.scope: Deactivated successfully.
Dec 10 10:17:28 compute-0 sudo[204736]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:28 compute-0 sudo[204921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quvnyzbrsvsimjqgupvivqgyodcwrkhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361848.4771562-900-253014244720078/AnsiballZ_file.py'
Dec 10 10:17:28 compute-0 sudo[204921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:28 compute-0 python3.9[204923]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:28 compute-0 sudo[204921]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:29 compute-0 sudo[205073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rajpsvfkpaerkgsiiyetznowdeyzpzib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361849.184344-909-73504730378754/AnsiballZ_podman_container_info.py'
Dec 10 10:17:29 compute-0 sudo[205073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:29 compute-0 python3.9[205075]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 10 10:17:29 compute-0 sudo[205073]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:30 compute-0 sudo[205238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivvqjdknwrxyctqyzuqrhinypcvyfohd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361849.9123695-917-103534318534876/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:30 compute-0 sudo[205238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:30 compute-0 python3.9[205240]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:30 compute-0 systemd[1]: Started libpod-conmon-16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a.scope.
Dec 10 10:17:30 compute-0 podman[205241]: 2025-12-10 10:17:30.49195868 +0000 UTC m=+0.062705527 container exec 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 10 10:17:30 compute-0 podman[205241]: 2025-12-10 10:17:30.523296747 +0000 UTC m=+0.094043604 container exec_died 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 10 10:17:30 compute-0 systemd[1]: libpod-conmon-16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a.scope: Deactivated successfully.
Dec 10 10:17:30 compute-0 sudo[205238]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:31 compute-0 sudo[205421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekngvqphqmthyovphfxzeapstfwztfqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361850.738166-925-140052447648382/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:31 compute-0 sudo[205421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:31 compute-0 python3.9[205423]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:31 compute-0 systemd[1]: Started libpod-conmon-16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a.scope.
Dec 10 10:17:31 compute-0 podman[205424]: 2025-12-10 10:17:31.398047581 +0000 UTC m=+0.109297251 container exec 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 10 10:17:31 compute-0 podman[205424]: 2025-12-10 10:17:31.433628145 +0000 UTC m=+0.144877825 container exec_died 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true)
Dec 10 10:17:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:17:31.457 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:17:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:17:31.459 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:17:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:17:31.459 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:17:31 compute-0 systemd[1]: libpod-conmon-16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a.scope: Deactivated successfully.
Dec 10 10:17:31 compute-0 sudo[205421]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:31 compute-0 sudo[205603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cptixexibauttqpsfgkxghvgqnsjbvkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361851.6764247-933-62729850066996/AnsiballZ_file.py'
Dec 10 10:17:31 compute-0 sudo[205603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:32 compute-0 python3.9[205605]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:32 compute-0 sudo[205603]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:32 compute-0 sudo[205755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiwabpawztyayxvcfximipszxrzqarmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361852.4623408-942-90325775121025/AnsiballZ_podman_container_info.py'
Dec 10 10:17:32 compute-0 sudo[205755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:32 compute-0 python3.9[205757]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Dec 10 10:17:33 compute-0 sudo[205755]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:33 compute-0 sudo[205920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcwoowxbuluukedtjmdwglecyifarvij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361853.1866062-950-101691994190795/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:33 compute-0 sudo[205920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:33 compute-0 python3.9[205922]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:33 compute-0 systemd[1]: Started libpod-conmon-1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb.scope.
Dec 10 10:17:33 compute-0 podman[205923]: 2025-12-10 10:17:33.812178443 +0000 UTC m=+0.092915643 container exec 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 10 10:17:33 compute-0 podman[205923]: 2025-12-10 10:17:33.850142812 +0000 UTC m=+0.130879922 container exec_died 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 10 10:17:33 compute-0 systemd[1]: libpod-conmon-1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb.scope: Deactivated successfully.
Dec 10 10:17:33 compute-0 sudo[205920]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:34 compute-0 sudo[206104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpeisbepxzgyodlibrlksmuimklktjif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361854.1230166-958-99364157931145/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:34 compute-0 sudo[206104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:34 compute-0 python3.9[206106]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:34 compute-0 systemd[1]: Started libpod-conmon-1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb.scope.
Dec 10 10:17:34 compute-0 podman[206107]: 2025-12-10 10:17:34.772332184 +0000 UTC m=+0.093930661 container exec 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:17:34 compute-0 podman[206107]: 2025-12-10 10:17:34.805257915 +0000 UTC m=+0.126856402 container exec_died 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 10 10:17:34 compute-0 systemd[1]: libpod-conmon-1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb.scope: Deactivated successfully.
Dec 10 10:17:34 compute-0 sudo[206104]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:35 compute-0 sudo[206287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgvealwjyxtpkoidzzoybuvvnhmnpxwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361855.0301075-966-19383029248388/AnsiballZ_file.py'
Dec 10 10:17:35 compute-0 sudo[206287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:35 compute-0 python3.9[206289]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:35 compute-0 sudo[206287]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:36 compute-0 sudo[206439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmeziadkngecfnrtkcekzozjcrzxwadd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361855.7413783-975-23675561006822/AnsiballZ_podman_container_info.py'
Dec 10 10:17:36 compute-0 sudo[206439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:36 compute-0 python3.9[206441]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Dec 10 10:17:36 compute-0 sudo[206439]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:36 compute-0 sudo[206604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zelvfdpkcqmoesrfsmzuekhaxtytgwyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361856.486885-983-74790415656950/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:36 compute-0 sudo[206604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:36 compute-0 podman[206606]: 2025-12-10 10:17:36.862764909 +0000 UTC m=+0.055604243 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:17:37 compute-0 python3.9[206607]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:37 compute-0 systemd[1]: Started libpod-conmon-926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12.scope.
Dec 10 10:17:37 compute-0 podman[206631]: 2025-12-10 10:17:37.115572386 +0000 UTC m=+0.085426088 container exec 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 10 10:17:37 compute-0 podman[206631]: 2025-12-10 10:17:37.145934296 +0000 UTC m=+0.115787978 container exec_died 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 10 10:17:37 compute-0 systemd[1]: libpod-conmon-926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12.scope: Deactivated successfully.
Dec 10 10:17:37 compute-0 sudo[206604]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:37 compute-0 sudo[206812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuplxfjsqkdwpysrdiltzrtfinekeegw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361857.3433259-991-31187586861890/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:37 compute-0 sudo[206812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:37 compute-0 python3.9[206814]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:37 compute-0 systemd[1]: Started libpod-conmon-926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12.scope.
Dec 10 10:17:37 compute-0 podman[206815]: 2025-12-10 10:17:37.9143356 +0000 UTC m=+0.074858638 container exec 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:17:37 compute-0 podman[206815]: 2025-12-10 10:17:37.954452508 +0000 UTC m=+0.114975476 container exec_died 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 10 10:17:37 compute-0 systemd[1]: libpod-conmon-926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12.scope: Deactivated successfully.
Dec 10 10:17:37 compute-0 sudo[206812]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:38 compute-0 podman[206831]: 2025-12-10 10:17:38.00166088 +0000 UTC m=+0.085121620 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 10 10:17:38 compute-0 sudo[207011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnmszaulekhdabkvqvcnidwrjbhypxpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361858.1506686-999-235943241206642/AnsiballZ_file.py'
Dec 10 10:17:38 compute-0 sudo[207011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:38 compute-0 python3.9[207013]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:38 compute-0 sudo[207011]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:39 compute-0 sudo[207163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dotguyukhyzhwfottuzpgmaclgzyvfjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361858.825972-1008-134690281114794/AnsiballZ_podman_container_info.py'
Dec 10 10:17:39 compute-0 sudo[207163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:39 compute-0 python3.9[207165]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec 10 10:17:39 compute-0 sudo[207163]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:39 compute-0 sudo[207328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijwzawcdykoxdbrypumhwrmzccwvngrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361859.603768-1016-270766972011120/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:39 compute-0 sudo[207328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:40 compute-0 python3.9[207330]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:40 compute-0 systemd[1]: Started libpod-conmon-ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9.scope.
Dec 10 10:17:40 compute-0 podman[207331]: 2025-12-10 10:17:40.24723944 +0000 UTC m=+0.119402398 container exec ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 10 10:17:40 compute-0 podman[207331]: 2025-12-10 10:17:40.279098902 +0000 UTC m=+0.151261840 container exec_died ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:17:40 compute-0 systemd[1]: libpod-conmon-ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9.scope: Deactivated successfully.
Dec 10 10:17:40 compute-0 sudo[207328]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:40 compute-0 sudo[207514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfsffiolnsexgkohkjfvdmvetxtrwajx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361860.4687178-1024-137280066561330/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:40 compute-0 sudo[207514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:41 compute-0 python3.9[207516]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:41 compute-0 systemd[1]: Started libpod-conmon-ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9.scope.
Dec 10 10:17:41 compute-0 podman[207517]: 2025-12-10 10:17:41.1046759 +0000 UTC m=+0.085972433 container exec ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 10 10:17:41 compute-0 podman[207517]: 2025-12-10 10:17:41.141131528 +0000 UTC m=+0.122428011 container exec_died ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 10 10:17:41 compute-0 systemd[1]: libpod-conmon-ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9.scope: Deactivated successfully.
Dec 10 10:17:41 compute-0 sudo[207514]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:41 compute-0 sudo[207698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unmihatyhhrflzugsbprorctofgckuqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361861.345451-1032-148534954276425/AnsiballZ_file.py'
Dec 10 10:17:41 compute-0 sudo[207698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:41 compute-0 python3.9[207700]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:41 compute-0 sudo[207698]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:42 compute-0 sudo[207850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdwaaorlfjxhdrdhvryjnjlttccygyrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361862.1134098-1041-73787296703239/AnsiballZ_podman_container_info.py'
Dec 10 10:17:42 compute-0 sudo[207850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:42 compute-0 python3.9[207852]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec 10 10:17:42 compute-0 sudo[207850]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:43 compute-0 sudo[208016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmiuillsigosukiiyygbzkekbexxdfqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361862.837264-1049-181112475042783/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:43 compute-0 sudo[208016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:43 compute-0 nova_compute[186989]: 2025-12-10 10:17:43.210 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:17:43 compute-0 nova_compute[186989]: 2025-12-10 10:17:43.212 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:17:43 compute-0 nova_compute[186989]: 2025-12-10 10:17:43.237 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:17:43 compute-0 nova_compute[186989]: 2025-12-10 10:17:43.238 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:17:43 compute-0 nova_compute[186989]: 2025-12-10 10:17:43.238 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:17:43 compute-0 nova_compute[186989]: 2025-12-10 10:17:43.238 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:17:43 compute-0 python3.9[208018]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:43 compute-0 systemd[1]: Started libpod-conmon-70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e.scope.
Dec 10 10:17:43 compute-0 podman[208019]: 2025-12-10 10:17:43.501504219 +0000 UTC m=+0.136183547 container exec 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6)
Dec 10 10:17:43 compute-0 podman[208019]: 2025-12-10 10:17:43.537237607 +0000 UTC m=+0.171916885 container exec_died 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Dec 10 10:17:43 compute-0 systemd[1]: libpod-conmon-70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e.scope: Deactivated successfully.
Dec 10 10:17:43 compute-0 sudo[208016]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:43 compute-0 nova_compute[186989]: 2025-12-10 10:17:43.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:17:43 compute-0 nova_compute[186989]: 2025-12-10 10:17:43.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:17:43 compute-0 nova_compute[186989]: 2025-12-10 10:17:43.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:17:43 compute-0 nova_compute[186989]: 2025-12-10 10:17:43.938 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:17:44 compute-0 sudo[208200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whntetkwwebznjspnpqluuilpydiutmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361863.7548401-1057-227233161837316/AnsiballZ_podman_container_exec.py'
Dec 10 10:17:44 compute-0 sudo[208200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:44 compute-0 python3.9[208202]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 10 10:17:44 compute-0 systemd[1]: Started libpod-conmon-70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e.scope.
Dec 10 10:17:44 compute-0 podman[208203]: 2025-12-10 10:17:44.304308284 +0000 UTC m=+0.076497684 container exec 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 10 10:17:44 compute-0 podman[208203]: 2025-12-10 10:17:44.338214912 +0000 UTC m=+0.110404332 container exec_died 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public)
Dec 10 10:17:44 compute-0 systemd[1]: libpod-conmon-70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e.scope: Deactivated successfully.
Dec 10 10:17:44 compute-0 sudo[208200]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:44 compute-0 podman[208220]: 2025-12-10 10:17:44.421362037 +0000 UTC m=+0.100516351 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 10 10:17:44 compute-0 podman[208249]: 2025-12-10 10:17:44.474392547 +0000 UTC m=+0.077191933 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible)
Dec 10 10:17:44 compute-0 podman[208281]: 2025-12-10 10:17:44.508422269 +0000 UTC m=+0.061635857 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:17:44 compute-0 sudo[208445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqzfvhmulghtmbzgmvrcidbqyrwelwxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361864.5486135-1065-208883763025252/AnsiballZ_file.py'
Dec 10 10:17:44 compute-0 sudo[208445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:44 compute-0 nova_compute[186989]: 2025-12-10 10:17:44.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:17:44 compute-0 nova_compute[186989]: 2025-12-10 10:17:44.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:17:44 compute-0 nova_compute[186989]: 2025-12-10 10:17:44.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:17:44 compute-0 nova_compute[186989]: 2025-12-10 10:17:44.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:17:44 compute-0 nova_compute[186989]: 2025-12-10 10:17:44.952 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:17:44 compute-0 nova_compute[186989]: 2025-12-10 10:17:44.952 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:17:44 compute-0 nova_compute[186989]: 2025-12-10 10:17:44.952 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:17:44 compute-0 nova_compute[186989]: 2025-12-10 10:17:44.952 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:17:45 compute-0 python3.9[208447]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:45 compute-0 sudo[208445]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:45 compute-0 nova_compute[186989]: 2025-12-10 10:17:45.117 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:17:45 compute-0 nova_compute[186989]: 2025-12-10 10:17:45.119 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5866MB free_disk=73.36400985717773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:17:45 compute-0 nova_compute[186989]: 2025-12-10 10:17:45.119 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:17:45 compute-0 nova_compute[186989]: 2025-12-10 10:17:45.119 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:17:45 compute-0 nova_compute[186989]: 2025-12-10 10:17:45.173 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:17:45 compute-0 nova_compute[186989]: 2025-12-10 10:17:45.173 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:17:45 compute-0 nova_compute[186989]: 2025-12-10 10:17:45.191 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:17:45 compute-0 nova_compute[186989]: 2025-12-10 10:17:45.212 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:17:45 compute-0 nova_compute[186989]: 2025-12-10 10:17:45.213 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:17:45 compute-0 nova_compute[186989]: 2025-12-10 10:17:45.214 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:17:45 compute-0 sudo[208597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmoiqyrbjyobkbemeroufbqplztnewpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361865.289298-1074-140364540495127/AnsiballZ_file.py'
Dec 10 10:17:45 compute-0 sudo[208597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:45 compute-0 python3.9[208599]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:45 compute-0 sudo[208597]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:46 compute-0 sudo[208749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iayfmrmnkkidqtzajarymzzxqheqmmhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361865.9514985-1082-69410335604373/AnsiballZ_stat.py'
Dec 10 10:17:46 compute-0 sudo[208749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:46 compute-0 python3.9[208751]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:17:46 compute-0 sudo[208749]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:46 compute-0 sudo[208872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yavapuqzlsjgwujkxvimjoribwiftwuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361865.9514985-1082-69410335604373/AnsiballZ_copy.py'
Dec 10 10:17:46 compute-0 sudo[208872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:47 compute-0 python3.9[208874]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765361865.9514985-1082-69410335604373/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:47 compute-0 sudo[208872]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:47 compute-0 sudo[209024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hujpetomulenhdyniwurdjsitnzqliqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361867.2961283-1098-166624184481249/AnsiballZ_file.py'
Dec 10 10:17:47 compute-0 sudo[209024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:47 compute-0 python3.9[209026]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:47 compute-0 sudo[209024]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:48 compute-0 sudo[209176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uifwksirgkjvgldktlerwdwmwybiycow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361867.955781-1106-57534992156216/AnsiballZ_stat.py'
Dec 10 10:17:48 compute-0 sudo[209176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:48 compute-0 python3.9[209178]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:17:48 compute-0 sudo[209176]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:48 compute-0 sudo[209254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dasdqjdgoyxcfwuofbmqqswjmiyvkcmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361867.955781-1106-57534992156216/AnsiballZ_file.py'
Dec 10 10:17:48 compute-0 sudo[209254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:48 compute-0 python3.9[209256]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:49 compute-0 sudo[209254]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:49 compute-0 sudo[209406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awdhcfflzataofhctxpbhprrrkjwmcvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361869.224275-1118-277550697599015/AnsiballZ_stat.py'
Dec 10 10:17:49 compute-0 sudo[209406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:49 compute-0 python3.9[209408]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:17:49 compute-0 sudo[209406]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:50 compute-0 sudo[209484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjrpsvhimfgfxtsbbjnaqjeklgazcuiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361869.224275-1118-277550697599015/AnsiballZ_file.py'
Dec 10 10:17:50 compute-0 sudo[209484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:50 compute-0 python3.9[209486]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.u3bhij0x recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:50 compute-0 sudo[209484]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:50 compute-0 sudo[209636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ketrhsxqiyjkaqsknsxmhgdovqagxqdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361870.47668-1130-151309495385114/AnsiballZ_stat.py'
Dec 10 10:17:50 compute-0 sudo[209636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:50 compute-0 python3.9[209638]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:17:51 compute-0 sudo[209636]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:51 compute-0 sudo[209714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvskggswgbgtnsrpomtpkjrbskpdlako ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361870.47668-1130-151309495385114/AnsiballZ_file.py'
Dec 10 10:17:51 compute-0 sudo[209714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:51 compute-0 python3.9[209716]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:51 compute-0 sudo[209714]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:52 compute-0 sudo[209879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czxwhqtupzvjrgimbynmfskrtntgpnrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361871.7053673-1143-56973067467482/AnsiballZ_command.py'
Dec 10 10:17:52 compute-0 sudo[209879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:52 compute-0 podman[209840]: 2025-12-10 10:17:52.051693947 +0000 UTC m=+0.096551793 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 10 10:17:52 compute-0 python3.9[209887]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:17:52 compute-0 sudo[209879]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:52 compute-0 sudo[210040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmqcivwefmrxqnxoaehtatuuwcdpzgto ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765361872.4390342-1151-139972583047723/AnsiballZ_edpm_nftables_from_files.py'
Dec 10 10:17:52 compute-0 sudo[210040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:53 compute-0 python3[210042]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 10 10:17:53 compute-0 sudo[210040]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:53 compute-0 sudo[210192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxgcsfjkeuxnyxqzwjxcmfxxescqaart ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361873.2760518-1159-42622423111578/AnsiballZ_stat.py'
Dec 10 10:17:53 compute-0 sudo[210192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:53 compute-0 python3.9[210194]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:17:53 compute-0 sudo[210192]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:54 compute-0 sudo[210270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfathiiqsqrmlovxeblylpyueursjocl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361873.2760518-1159-42622423111578/AnsiballZ_file.py'
Dec 10 10:17:54 compute-0 sudo[210270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:54 compute-0 python3.9[210272]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:54 compute-0 sudo[210270]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:54 compute-0 sudo[210423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aezotktkozdfmqfgxcyfmbmvcpzuqrux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361874.4021096-1171-208309181563443/AnsiballZ_stat.py'
Dec 10 10:17:54 compute-0 sudo[210423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:54 compute-0 podman[210396]: 2025-12-10 10:17:54.933619078 +0000 UTC m=+0.068069254 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:17:54 compute-0 python3.9[210425]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:17:55 compute-0 sudo[210423]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:55 compute-0 sudo[210524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlvpjyzehrhthgalsfhaoiggrqmkrhvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361874.4021096-1171-208309181563443/AnsiballZ_file.py'
Dec 10 10:17:55 compute-0 sudo[210524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:55 compute-0 python3.9[210526]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:55 compute-0 sudo[210524]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:55 compute-0 sudo[210676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucjqiykoydsrgetwsxktmaoonyrprvro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361875.6671498-1183-99643940170279/AnsiballZ_stat.py'
Dec 10 10:17:55 compute-0 sudo[210676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:56 compute-0 python3.9[210678]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:17:56 compute-0 sudo[210676]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:56 compute-0 sudo[210754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxkrqdqaklxylqqnyyqzqjnshswlmjbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361875.6671498-1183-99643940170279/AnsiballZ_file.py'
Dec 10 10:17:56 compute-0 sudo[210754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:56 compute-0 python3.9[210756]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:56 compute-0 sudo[210754]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:57 compute-0 sudo[210906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czxcqptngspixlmjaiaimwkoiehtqunr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361876.8105538-1195-75845075517090/AnsiballZ_stat.py'
Dec 10 10:17:57 compute-0 sudo[210906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:57 compute-0 python3.9[210908]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:17:57 compute-0 sudo[210906]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:57 compute-0 sudo[210984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptfbthjrirqyccelumifhmhxlahuefnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361876.8105538-1195-75845075517090/AnsiballZ_file.py'
Dec 10 10:17:57 compute-0 sudo[210984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:57 compute-0 python3.9[210986]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:57 compute-0 sudo[210984]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:58 compute-0 sudo[211136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqaihzfxnhkgcarszumwspzwrzecfchb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361877.9198785-1207-85508736724467/AnsiballZ_stat.py'
Dec 10 10:17:58 compute-0 sudo[211136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:58 compute-0 python3.9[211138]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 10 10:17:58 compute-0 sudo[211136]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:58 compute-0 sudo[211261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htwnguebugtfkvcwwrywzlcrbhapqwgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361877.9198785-1207-85508736724467/AnsiballZ_copy.py'
Dec 10 10:17:58 compute-0 sudo[211261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:59 compute-0 python3.9[211263]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765361877.9198785-1207-85508736724467/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:59 compute-0 sudo[211261]: pam_unix(sudo:session): session closed for user root
Dec 10 10:17:59 compute-0 sudo[211413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irmuulknrhojnfxrmqrqqgxfloptvjkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361879.2831051-1222-39362270178565/AnsiballZ_file.py'
Dec 10 10:17:59 compute-0 sudo[211413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:17:59 compute-0 python3.9[211415]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:17:59 compute-0 sudo[211413]: pam_unix(sudo:session): session closed for user root
Dec 10 10:18:00 compute-0 sudo[211565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsbxshofjcqpkvgqjusjnwasrzmpavio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361879.9575531-1230-37595990270021/AnsiballZ_command.py'
Dec 10 10:18:00 compute-0 sudo[211565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:18:00 compute-0 python3.9[211567]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:18:00 compute-0 sudo[211565]: pam_unix(sudo:session): session closed for user root
Dec 10 10:18:01 compute-0 sudo[211720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fycxibxxesgrprovhfptmhqzahmrubbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361880.7080433-1238-233107623628425/AnsiballZ_blockinfile.py'
Dec 10 10:18:01 compute-0 sudo[211720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:18:01 compute-0 python3.9[211722]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:18:01 compute-0 sudo[211720]: pam_unix(sudo:session): session closed for user root
Dec 10 10:18:01 compute-0 sudo[211872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aycehmfftmefupbumlggicbbkzgpjcqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361881.6432016-1247-89651093885762/AnsiballZ_command.py'
Dec 10 10:18:01 compute-0 sudo[211872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:18:02 compute-0 python3.9[211874]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:18:02 compute-0 sudo[211872]: pam_unix(sudo:session): session closed for user root
Dec 10 10:18:02 compute-0 sudo[212025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zosmybftcqlgaowzticwtpteqonoaqik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361882.3651175-1255-84013251835167/AnsiballZ_stat.py'
Dec 10 10:18:02 compute-0 sudo[212025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:18:03 compute-0 python3.9[212027]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 10 10:18:03 compute-0 sudo[212025]: pam_unix(sudo:session): session closed for user root
Dec 10 10:18:03 compute-0 sudo[212179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deogkbodkfljadgojwueeyxpkmbnpeqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361883.2186787-1263-168044263010492/AnsiballZ_command.py'
Dec 10 10:18:03 compute-0 sudo[212179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:18:03 compute-0 python3.9[212181]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 10 10:18:03 compute-0 sudo[212179]: pam_unix(sudo:session): session closed for user root
Dec 10 10:18:04 compute-0 sudo[212334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vixwksiytqeenfbxtpcgslkwouuinxdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765361883.8711932-1271-258613313920015/AnsiballZ_file.py'
Dec 10 10:18:04 compute-0 sudo[212334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:18:04 compute-0 python3.9[212336]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 10 10:18:04 compute-0 sudo[212334]: pam_unix(sudo:session): session closed for user root
Dec 10 10:18:04 compute-0 sshd-session[187361]: Connection closed by 192.168.122.30 port 40772
Dec 10 10:18:04 compute-0 sshd-session[187358]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:18:04 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Dec 10 10:18:04 compute-0 systemd[1]: session-26.scope: Consumed 1min 47.554s CPU time.
Dec 10 10:18:04 compute-0 systemd-logind[787]: Session 26 logged out. Waiting for processes to exit.
Dec 10 10:18:04 compute-0 systemd-logind[787]: Removed session 26.
Dec 10 10:18:07 compute-0 podman[212361]: 2025-12-10 10:18:07.066984883 +0000 UTC m=+0.067397055 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:18:09 compute-0 podman[212386]: 2025-12-10 10:18:09.022812575 +0000 UTC m=+0.061539174 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Dec 10 10:18:15 compute-0 podman[212406]: 2025-12-10 10:18:15.031225639 +0000 UTC m=+0.056900618 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:18:15 compute-0 podman[212405]: 2025-12-10 10:18:15.049412326 +0000 UTC m=+0.074166220 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=edpm, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 10 10:18:15 compute-0 podman[212407]: 2025-12-10 10:18:15.086025069 +0000 UTC m=+0.098829046 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 10 10:18:23 compute-0 podman[212470]: 2025-12-10 10:18:23.027588523 +0000 UTC m=+0.065221800 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm)
Dec 10 10:18:26 compute-0 podman[212489]: 2025-12-10 10:18:26.03364565 +0000 UTC m=+0.065000344 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 10 10:18:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:18:31.458 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:18:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:18:31.459 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:18:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:18:31.460 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:18:38 compute-0 podman[212514]: 2025-12-10 10:18:38.010200598 +0000 UTC m=+0.056782631 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 10 10:18:39 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:18:39.126 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:d5:69', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:b1:dd:ed:fa:0b'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:18:39 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:18:39.128 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 10 10:18:39 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:18:39.129 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:18:40 compute-0 podman[212539]: 2025-12-10 10:18:40.065837484 +0000 UTC m=+0.099801004 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 10 10:18:43 compute-0 nova_compute[186989]: 2025-12-10 10:18:43.214 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:18:43 compute-0 nova_compute[186989]: 2025-12-10 10:18:43.214 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:18:43 compute-0 nova_compute[186989]: 2025-12-10 10:18:43.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:18:43 compute-0 nova_compute[186989]: 2025-12-10 10:18:43.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:18:43 compute-0 nova_compute[186989]: 2025-12-10 10:18:43.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:18:43 compute-0 nova_compute[186989]: 2025-12-10 10:18:43.935 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:18:44 compute-0 nova_compute[186989]: 2025-12-10 10:18:44.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:18:44 compute-0 nova_compute[186989]: 2025-12-10 10:18:44.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:18:44 compute-0 nova_compute[186989]: 2025-12-10 10:18:44.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:18:44 compute-0 nova_compute[186989]: 2025-12-10 10:18:44.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:18:44 compute-0 nova_compute[186989]: 2025-12-10 10:18:44.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:18:44 compute-0 nova_compute[186989]: 2025-12-10 10:18:44.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.422 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.422 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.423 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.423 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.423 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.423 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.423 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.423 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.423 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.423 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.424 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:18:45.425 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:18:46 compute-0 podman[212561]: 2025-12-10 10:18:46.028083555 +0000 UTC m=+0.069327895 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 10 10:18:46 compute-0 podman[212559]: 2025-12-10 10:18:46.040429384 +0000 UTC m=+0.086324335 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:18:46 compute-0 podman[212560]: 2025-12-10 10:18:46.041329728 +0000 UTC m=+0.085463610 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 10 10:18:46 compute-0 nova_compute[186989]: 2025-12-10 10:18:46.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:18:46 compute-0 nova_compute[186989]: 2025-12-10 10:18:46.946 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:18:46 compute-0 nova_compute[186989]: 2025-12-10 10:18:46.947 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:18:46 compute-0 nova_compute[186989]: 2025-12-10 10:18:46.947 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:18:46 compute-0 nova_compute[186989]: 2025-12-10 10:18:46.947 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:18:47 compute-0 nova_compute[186989]: 2025-12-10 10:18:47.078 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:18:47 compute-0 nova_compute[186989]: 2025-12-10 10:18:47.079 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5979MB free_disk=73.36357116699219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:18:47 compute-0 nova_compute[186989]: 2025-12-10 10:18:47.079 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:18:47 compute-0 nova_compute[186989]: 2025-12-10 10:18:47.079 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:18:47 compute-0 nova_compute[186989]: 2025-12-10 10:18:47.146 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:18:47 compute-0 nova_compute[186989]: 2025-12-10 10:18:47.146 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:18:47 compute-0 nova_compute[186989]: 2025-12-10 10:18:47.167 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:18:47 compute-0 nova_compute[186989]: 2025-12-10 10:18:47.185 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:18:47 compute-0 nova_compute[186989]: 2025-12-10 10:18:47.187 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:18:47 compute-0 nova_compute[186989]: 2025-12-10 10:18:47.188 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:18:54 compute-0 podman[212623]: 2025-12-10 10:18:54.026527653 +0000 UTC m=+0.071269770 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, config_id=edpm, io.buildah.version=1.33.7)
Dec 10 10:18:57 compute-0 podman[212644]: 2025-12-10 10:18:57.052599486 +0000 UTC m=+0.086856810 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:19:09 compute-0 podman[212670]: 2025-12-10 10:19:09.024770591 +0000 UTC m=+0.066944689 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 10 10:19:11 compute-0 podman[212694]: 2025-12-10 10:19:11.02230566 +0000 UTC m=+0.065358414 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:19:17 compute-0 podman[212714]: 2025-12-10 10:19:17.047480783 +0000 UTC m=+0.075369456 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:19:17 compute-0 podman[212716]: 2025-12-10 10:19:17.069673659 +0000 UTC m=+0.094857156 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:19:17 compute-0 podman[212715]: 2025-12-10 10:19:17.07577047 +0000 UTC m=+0.104622609 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 10 10:19:25 compute-0 podman[212779]: 2025-12-10 10:19:25.050744244 +0000 UTC m=+0.080916900 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, config_id=edpm, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Dec 10 10:19:28 compute-0 podman[212800]: 2025-12-10 10:19:28.024660239 +0000 UTC m=+0.057253035 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 10 10:19:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:19:31.460 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:19:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:19:31.460 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:19:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:19:31.460 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:19:40 compute-0 podman[212824]: 2025-12-10 10:19:40.014405599 +0000 UTC m=+0.063491847 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 10 10:19:42 compute-0 podman[212848]: 2025-12-10 10:19:42.025548772 +0000 UTC m=+0.058968292 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 10 10:19:43 compute-0 nova_compute[186989]: 2025-12-10 10:19:43.188 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:19:44 compute-0 nova_compute[186989]: 2025-12-10 10:19:44.916 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:19:44 compute-0 nova_compute[186989]: 2025-12-10 10:19:44.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:19:44 compute-0 nova_compute[186989]: 2025-12-10 10:19:44.920 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:19:44 compute-0 nova_compute[186989]: 2025-12-10 10:19:44.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:19:44 compute-0 nova_compute[186989]: 2025-12-10 10:19:44.943 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:19:44 compute-0 nova_compute[186989]: 2025-12-10 10:19:44.944 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:19:44 compute-0 nova_compute[186989]: 2025-12-10 10:19:44.945 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:19:44 compute-0 nova_compute[186989]: 2025-12-10 10:19:44.945 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:19:45 compute-0 nova_compute[186989]: 2025-12-10 10:19:45.923 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:19:46 compute-0 nova_compute[186989]: 2025-12-10 10:19:46.917 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:19:46 compute-0 nova_compute[186989]: 2025-12-10 10:19:46.956 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:19:46 compute-0 nova_compute[186989]: 2025-12-10 10:19:46.956 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:19:46 compute-0 nova_compute[186989]: 2025-12-10 10:19:46.956 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.030 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.031 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.031 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.032 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.203 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.204 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6031MB free_disk=73.36822891235352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.204 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.205 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.272 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.273 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.297 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.311 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.312 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:19:47 compute-0 nova_compute[186989]: 2025-12-10 10:19:47.312 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:19:48 compute-0 podman[212869]: 2025-12-10 10:19:48.002686412 +0000 UTC m=+0.047295199 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 10 10:19:48 compute-0 podman[212868]: 2025-12-10 10:19:48.015596099 +0000 UTC m=+0.063678982 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 10 10:19:48 compute-0 podman[212870]: 2025-12-10 10:19:48.084510605 +0000 UTC m=+0.115197267 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 10 10:19:56 compute-0 podman[212931]: 2025-12-10 10:19:56.047844037 +0000 UTC m=+0.086701999 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Dec 10 10:19:59 compute-0 podman[212953]: 2025-12-10 10:19:59.046556777 +0000 UTC m=+0.079993693 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:20:11 compute-0 podman[212978]: 2025-12-10 10:20:11.027567658 +0000 UTC m=+0.057155641 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 10 10:20:13 compute-0 podman[213002]: 2025-12-10 10:20:13.022520492 +0000 UTC m=+0.064389123 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 10 10:20:16 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:16.128 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:d5:69', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:b1:dd:ed:fa:0b'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:20:16 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:16.130 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 10 10:20:19 compute-0 podman[213021]: 2025-12-10 10:20:19.064588187 +0000 UTC m=+0.092436638 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 10 10:20:19 compute-0 podman[213022]: 2025-12-10 10:20:19.073116504 +0000 UTC m=+0.094313440 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 10 10:20:19 compute-0 podman[213023]: 2025-12-10 10:20:19.109813589 +0000 UTC m=+0.127542969 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:20:19 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:19.134 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:20:27 compute-0 podman[213086]: 2025-12-10 10:20:27.050165115 +0000 UTC m=+0.084587233 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, architecture=x86_64)
Dec 10 10:20:30 compute-0 podman[213108]: 2025-12-10 10:20:30.047923305 +0000 UTC m=+0.078027152 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:20:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:31.461 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:31.463 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:31.463 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.311 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "70a74a19-d800-4441-ae54-2289aed3ee93" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.312 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.335 186993 DEBUG nova.compute.manager [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.461 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.462 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.471 186993 DEBUG nova.virt.hardware [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.472 186993 INFO nova.compute.claims [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.605 186993 DEBUG nova.compute.provider_tree [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.626 186993 DEBUG nova.scheduler.client.report [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.655 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.656 186993 DEBUG nova.compute.manager [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.722 186993 DEBUG nova.compute.manager [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.723 186993 DEBUG nova.network.neutron [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.751 186993 INFO nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.780 186993 DEBUG nova.compute.manager [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.875 186993 DEBUG nova.compute.manager [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.878 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.879 186993 INFO nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Creating image(s)
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.880 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.881 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.882 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.883 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:33 compute-0 nova_compute[186989]: 2025-12-10 10:20:33.883 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:34 compute-0 nova_compute[186989]: 2025-12-10 10:20:34.310 186993 WARNING oslo_policy.policy [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Dec 10 10:20:34 compute-0 nova_compute[186989]: 2025-12-10 10:20:34.310 186993 WARNING oslo_policy.policy [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Dec 10 10:20:34 compute-0 nova_compute[186989]: 2025-12-10 10:20:34.314 186993 DEBUG nova.policy [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:20:35 compute-0 nova_compute[186989]: 2025-12-10 10:20:35.662 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:20:35 compute-0 nova_compute[186989]: 2025-12-10 10:20:35.751 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1.part --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:20:35 compute-0 nova_compute[186989]: 2025-12-10 10:20:35.754 186993 DEBUG nova.virt.images [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] db4e7c9d-c1ff-44a9-9cd7-57ab019e9474 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Dec 10 10:20:35 compute-0 nova_compute[186989]: 2025-12-10 10:20:35.755 186993 DEBUG nova.privsep.utils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 10 10:20:35 compute-0 nova_compute[186989]: 2025-12-10 10:20:35.755 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1.part /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:20:35 compute-0 nova_compute[186989]: 2025-12-10 10:20:35.927 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1.part /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1.converted" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:20:35 compute-0 nova_compute[186989]: 2025-12-10 10:20:35.936 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.007 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1.converted --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.009 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.035 186993 INFO oslo.privsep.daemon [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp38yna4fi/privsep.sock']
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.462 186993 DEBUG nova.network.neutron [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Successfully created port: 71336fed-be63-43d7-a554-f86e2de83e54 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.749 186993 INFO oslo.privsep.daemon [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Spawned new privsep daemon via rootwrap
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.612 213152 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.619 213152 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.623 213152 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.623 213152 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213152
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.829 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.900 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.901 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.903 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.922 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.977 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:20:36 compute-0 nova_compute[186989]: 2025-12-10 10:20:36.979 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.013 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.016 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.017 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.071 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.072 186993 DEBUG nova.virt.disk.api [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.073 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.122 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.123 186993 DEBUG nova.virt.disk.api [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.124 186993 DEBUG nova.objects.instance [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid 70a74a19-d800-4441-ae54-2289aed3ee93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.137 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.138 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Ensure instance console log exists: /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.138 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.139 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:37 compute-0 nova_compute[186989]: 2025-12-10 10:20:37.139 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:38 compute-0 nova_compute[186989]: 2025-12-10 10:20:38.149 186993 DEBUG nova.network.neutron [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Successfully updated port: 71336fed-be63-43d7-a554-f86e2de83e54 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:20:38 compute-0 nova_compute[186989]: 2025-12-10 10:20:38.172 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:20:38 compute-0 nova_compute[186989]: 2025-12-10 10:20:38.173 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:20:38 compute-0 nova_compute[186989]: 2025-12-10 10:20:38.173 186993 DEBUG nova.network.neutron [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:20:38 compute-0 nova_compute[186989]: 2025-12-10 10:20:38.365 186993 DEBUG nova.network.neutron [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:20:38 compute-0 nova_compute[186989]: 2025-12-10 10:20:38.626 186993 DEBUG nova.compute.manager [req-b0a348b8-d622-4d16-be7c-8461dce04746 req-13968045-3f76-4adf-b2b1-895e5c43a2d2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Received event network-changed-71336fed-be63-43d7-a554-f86e2de83e54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:20:38 compute-0 nova_compute[186989]: 2025-12-10 10:20:38.627 186993 DEBUG nova.compute.manager [req-b0a348b8-d622-4d16-be7c-8461dce04746 req-13968045-3f76-4adf-b2b1-895e5c43a2d2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Refreshing instance network info cache due to event network-changed-71336fed-be63-43d7-a554-f86e2de83e54. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:20:38 compute-0 nova_compute[186989]: 2025-12-10 10:20:38.627 186993 DEBUG oslo_concurrency.lockutils [req-b0a348b8-d622-4d16-be7c-8461dce04746 req-13968045-3f76-4adf-b2b1-895e5c43a2d2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.580 186993 DEBUG nova.network.neutron [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Updating instance_info_cache with network_info: [{"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.603 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.604 186993 DEBUG nova.compute.manager [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Instance network_info: |[{"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.605 186993 DEBUG oslo_concurrency.lockutils [req-b0a348b8-d622-4d16-be7c-8461dce04746 req-13968045-3f76-4adf-b2b1-895e5c43a2d2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.606 186993 DEBUG nova.network.neutron [req-b0a348b8-d622-4d16-be7c-8461dce04746 req-13968045-3f76-4adf-b2b1-895e5c43a2d2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Refreshing network info cache for port 71336fed-be63-43d7-a554-f86e2de83e54 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.613 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Start _get_guest_xml network_info=[{"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.622 186993 WARNING nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.637 186993 DEBUG nova.virt.libvirt.host [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.639 186993 DEBUG nova.virt.libvirt.host [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.643 186993 DEBUG nova.virt.libvirt.host [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.644 186993 DEBUG nova.virt.libvirt.host [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.645 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.646 186993 DEBUG nova.virt.hardware [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.646 186993 DEBUG nova.virt.hardware [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.647 186993 DEBUG nova.virt.hardware [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.647 186993 DEBUG nova.virt.hardware [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.648 186993 DEBUG nova.virt.hardware [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.648 186993 DEBUG nova.virt.hardware [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.648 186993 DEBUG nova.virt.hardware [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.649 186993 DEBUG nova.virt.hardware [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.649 186993 DEBUG nova.virt.hardware [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.650 186993 DEBUG nova.virt.hardware [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.651 186993 DEBUG nova.virt.hardware [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.658 186993 DEBUG nova.privsep.utils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.659 186993 DEBUG nova.virt.libvirt.vif [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:20:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1525504432',display_name='tempest-TestNetworkBasicOps-server-1525504432',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1525504432',id=1,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOC6rK9F/ucDVofHFBa4F/1C0QjnIdaWZG2fWPFLbTZPf05eQ2wAiWAsLJ4rFrU4CRwA3UP4SHrJF3f+0pj0vvX7tiyE0cT3cm/SzlIJPJkQR0Xuox9T9cjPlhHFH6cC4g==',key_name='tempest-TestNetworkBasicOps-166157555',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-fgkp9kpt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:20:33Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=70a74a19-d800-4441-ae54-2289aed3ee93,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.660 186993 DEBUG nova.network.os_vif_util [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.661 186993 DEBUG nova.network.os_vif_util [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:a7:21,bridge_name='br-int',has_traffic_filtering=True,id=71336fed-be63-43d7-a554-f86e2de83e54,network=Network(77ce0e41-bb52-4715-b214-a29a8dab4ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71336fed-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.664 186993 DEBUG nova.objects.instance [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid 70a74a19-d800-4441-ae54-2289aed3ee93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.684 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:20:39 compute-0 nova_compute[186989]:   <uuid>70a74a19-d800-4441-ae54-2289aed3ee93</uuid>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   <name>instance-00000001</name>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-1525504432</nova:name>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:20:39</nova:creationTime>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:20:39 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:20:39 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:20:39 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:20:39 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:20:39 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:20:39 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:20:39 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:20:39 compute-0 nova_compute[186989]:         <nova:port uuid="71336fed-be63-43d7-a554-f86e2de83e54">
Dec 10 10:20:39 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <system>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <entry name="serial">70a74a19-d800-4441-ae54-2289aed3ee93</entry>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <entry name="uuid">70a74a19-d800-4441-ae54-2289aed3ee93</entry>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     </system>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   <os>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   </os>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   <features>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   </features>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk.config"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:15:a7:21"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <target dev="tap71336fed-be"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/console.log" append="off"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <video>
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     </video>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:20:39 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:20:39 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:20:39 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:20:39 compute-0 nova_compute[186989]: </domain>
Dec 10 10:20:39 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.686 186993 DEBUG nova.compute.manager [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Preparing to wait for external event network-vif-plugged-71336fed-be63-43d7-a554-f86e2de83e54 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.687 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.687 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.687 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.689 186993 DEBUG nova.virt.libvirt.vif [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:20:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1525504432',display_name='tempest-TestNetworkBasicOps-server-1525504432',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1525504432',id=1,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOC6rK9F/ucDVofHFBa4F/1C0QjnIdaWZG2fWPFLbTZPf05eQ2wAiWAsLJ4rFrU4CRwA3UP4SHrJF3f+0pj0vvX7tiyE0cT3cm/SzlIJPJkQR0Xuox9T9cjPlhHFH6cC4g==',key_name='tempest-TestNetworkBasicOps-166157555',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-fgkp9kpt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:20:33Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=70a74a19-d800-4441-ae54-2289aed3ee93,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.689 186993 DEBUG nova.network.os_vif_util [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.690 186993 DEBUG nova.network.os_vif_util [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:a7:21,bridge_name='br-int',has_traffic_filtering=True,id=71336fed-be63-43d7-a554-f86e2de83e54,network=Network(77ce0e41-bb52-4715-b214-a29a8dab4ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71336fed-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.691 186993 DEBUG os_vif [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:a7:21,bridge_name='br-int',has_traffic_filtering=True,id=71336fed-be63-43d7-a554-f86e2de83e54,network=Network(77ce0e41-bb52-4715-b214-a29a8dab4ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71336fed-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.754 186993 DEBUG ovsdbapp.backend.ovs_idl [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.755 186993 DEBUG ovsdbapp.backend.ovs_idl [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.755 186993 DEBUG ovsdbapp.backend.ovs_idl [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.755 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.756 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.756 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.757 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.759 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.761 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.774 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.775 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.775 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:20:39 compute-0 nova_compute[186989]: 2025-12-10 10:20:39.776 186993 INFO oslo.privsep.daemon [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpdmxhyq66/privsep.sock']
Dec 10 10:20:40 compute-0 nova_compute[186989]: 2025-12-10 10:20:40.535 186993 INFO oslo.privsep.daemon [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Spawned new privsep daemon via rootwrap
Dec 10 10:20:40 compute-0 nova_compute[186989]: 2025-12-10 10:20:40.412 213173 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 10 10:20:40 compute-0 nova_compute[186989]: 2025-12-10 10:20:40.420 213173 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 10 10:20:40 compute-0 nova_compute[186989]: 2025-12-10 10:20:40.424 213173 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 10 10:20:40 compute-0 nova_compute[186989]: 2025-12-10 10:20:40.424 213173 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213173
Dec 10 10:20:40 compute-0 nova_compute[186989]: 2025-12-10 10:20:40.854 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:40 compute-0 nova_compute[186989]: 2025-12-10 10:20:40.855 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71336fed-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:20:40 compute-0 nova_compute[186989]: 2025-12-10 10:20:40.856 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap71336fed-be, col_values=(('external_ids', {'iface-id': '71336fed-be63-43d7-a554-f86e2de83e54', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:a7:21', 'vm-uuid': '70a74a19-d800-4441-ae54-2289aed3ee93'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:20:40 compute-0 NetworkManager[55541]: <info>  [1765362040.8994] manager: (tap71336fed-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec 10 10:20:40 compute-0 nova_compute[186989]: 2025-12-10 10:20:40.899 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:40 compute-0 nova_compute[186989]: 2025-12-10 10:20:40.901 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:20:40 compute-0 nova_compute[186989]: 2025-12-10 10:20:40.907 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:40 compute-0 nova_compute[186989]: 2025-12-10 10:20:40.908 186993 INFO os_vif [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:a7:21,bridge_name='br-int',has_traffic_filtering=True,id=71336fed-be63-43d7-a554-f86e2de83e54,network=Network(77ce0e41-bb52-4715-b214-a29a8dab4ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71336fed-be')
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.067 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.068 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.068 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:15:a7:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.069 186993 INFO nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Using config drive
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.137 186993 DEBUG nova.network.neutron [req-b0a348b8-d622-4d16-be7c-8461dce04746 req-13968045-3f76-4adf-b2b1-895e5c43a2d2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Updated VIF entry in instance network info cache for port 71336fed-be63-43d7-a554-f86e2de83e54. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.139 186993 DEBUG nova.network.neutron [req-b0a348b8-d622-4d16-be7c-8461dce04746 req-13968045-3f76-4adf-b2b1-895e5c43a2d2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Updating instance_info_cache with network_info: [{"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.157 186993 DEBUG oslo_concurrency.lockutils [req-b0a348b8-d622-4d16-be7c-8461dce04746 req-13968045-3f76-4adf-b2b1-895e5c43a2d2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.531 186993 INFO nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Creating config drive at /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk.config
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.537 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpukv6g44d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.678 186993 DEBUG oslo_concurrency.processutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpukv6g44d" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:20:41 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec 10 10:20:41 compute-0 kernel: tap71336fed-be: entered promiscuous mode
Dec 10 10:20:41 compute-0 NetworkManager[55541]: <info>  [1765362041.7907] manager: (tap71336fed-be): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Dec 10 10:20:41 compute-0 ovn_controller[95452]: 2025-12-10T10:20:41Z|00027|binding|INFO|Claiming lport 71336fed-be63-43d7-a554-f86e2de83e54 for this chassis.
Dec 10 10:20:41 compute-0 ovn_controller[95452]: 2025-12-10T10:20:41Z|00028|binding|INFO|71336fed-be63-43d7-a554-f86e2de83e54: Claiming fa:16:3e:15:a7:21 10.100.0.8
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.794 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:41.807 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:a7:21 10.100.0.8'], port_security=['fa:16:3e:15:a7:21 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-77ce0e41-bb52-4715-b214-a29a8dab4ac8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': '51b83539-312c-4548-95d6-df26c7e14f7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2b87c41-5ca6-4aa4-a17b-506801c09033, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=71336fed-be63-43d7-a554-f86e2de83e54) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:20:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:41.809 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 71336fed-be63-43d7-a554-f86e2de83e54 in datapath 77ce0e41-bb52-4715-b214-a29a8dab4ac8 bound to our chassis
Dec 10 10:20:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:41.812 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 77ce0e41-bb52-4715-b214-a29a8dab4ac8
Dec 10 10:20:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:41.814 104302 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp3_d4f87f/privsep.sock']
Dec 10 10:20:41 compute-0 systemd-udevd[213209]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:20:41 compute-0 NetworkManager[55541]: <info>  [1765362041.8596] device (tap71336fed-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:20:41 compute-0 NetworkManager[55541]: <info>  [1765362041.8604] device (tap71336fed-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.878 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:41 compute-0 ovn_controller[95452]: 2025-12-10T10:20:41Z|00029|binding|INFO|Setting lport 71336fed-be63-43d7-a554-f86e2de83e54 ovn-installed in OVS
Dec 10 10:20:41 compute-0 ovn_controller[95452]: 2025-12-10T10:20:41Z|00030|binding|INFO|Setting lport 71336fed-be63-43d7-a554-f86e2de83e54 up in Southbound
Dec 10 10:20:41 compute-0 podman[213189]: 2025-12-10 10:20:41.884449643 +0000 UTC m=+0.117045240 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 10 10:20:41 compute-0 nova_compute[186989]: 2025-12-10 10:20:41.884 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:41 compute-0 systemd-machined[153379]: New machine qemu-1-instance-00000001.
Dec 10 10:20:41 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.015 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.203 186993 DEBUG nova.compute.manager [req-9deb1d1d-f049-4eb4-968d-b7c978523919 req-8bc2d915-4917-47e1-ba10-bfa3fc9927e8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Received event network-vif-plugged-71336fed-be63-43d7-a554-f86e2de83e54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.204 186993 DEBUG oslo_concurrency.lockutils [req-9deb1d1d-f049-4eb4-968d-b7c978523919 req-8bc2d915-4917-47e1-ba10-bfa3fc9927e8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.204 186993 DEBUG oslo_concurrency.lockutils [req-9deb1d1d-f049-4eb4-968d-b7c978523919 req-8bc2d915-4917-47e1-ba10-bfa3fc9927e8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.204 186993 DEBUG oslo_concurrency.lockutils [req-9deb1d1d-f049-4eb4-968d-b7c978523919 req-8bc2d915-4917-47e1-ba10-bfa3fc9927e8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.205 186993 DEBUG nova.compute.manager [req-9deb1d1d-f049-4eb4-968d-b7c978523919 req-8bc2d915-4917-47e1-ba10-bfa3fc9927e8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Processing event network-vif-plugged-71336fed-be63-43d7-a554-f86e2de83e54 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.299 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362042.2988026, 70a74a19-d800-4441-ae54-2289aed3ee93 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.300 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] VM Started (Lifecycle Event)
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.302 186993 DEBUG nova.compute.manager [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.306 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.314 186993 INFO nova.virt.libvirt.driver [-] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Instance spawned successfully.
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.315 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.345 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.349 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.374 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.375 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362042.2989292, 70a74a19-d800-4441-ae54-2289aed3ee93 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.375 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] VM Paused (Lifecycle Event)
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.379 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.379 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.380 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.380 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.380 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.381 186993 DEBUG nova.virt.libvirt.driver [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.392 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.406 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362042.3054562, 70a74a19-d800-4441-ae54-2289aed3ee93 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.406 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] VM Resumed (Lifecycle Event)
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.423 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.428 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.459 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.481 186993 INFO nova.compute.manager [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Took 8.60 seconds to spawn the instance on the hypervisor.
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.482 186993 DEBUG nova.compute.manager [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:20:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:42.506 104302 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 10 10:20:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:42.506 104302 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp3_d4f87f/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 10 10:20:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:42.364 213247 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 10 10:20:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:42.370 213247 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 10 10:20:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:42.372 213247 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 10 10:20:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:42.373 213247 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213247
Dec 10 10:20:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:42.509 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5ae028a2-191e-4867-a86c-5e27929a6140]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.540 186993 INFO nova.compute.manager [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Took 9.13 seconds to build instance.
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.562 186993 DEBUG oslo_concurrency.lockutils [None req-f104711f-0a55-443a-9f55-4fa20ed0f5de 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.940 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.940 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.940 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 10 10:20:42 compute-0 nova_compute[186989]: 2025-12-10 10:20:42.953 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:20:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:43.001 213247 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:43.001 213247 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:43.001 213247 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:43.540 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[3beb0d98-2454-4dc9-89d5-99208241f50b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:43.541 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap77ce0e41-b1 in ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 10 10:20:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:43.544 213247 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap77ce0e41-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 10 10:20:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:43.544 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[d887e8a3-b660-42cb-943d-8adb752bb05e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:43.548 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[09180a02-a7a9-4044-ab03-f17ed6530f4c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:43.574 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[06ebb65b-a018-443a-9280-c0bad3d3bb5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:43.596 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[19068f35-01b5-42c7-8d25-173ef36b6624]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:43.598 104302 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpcprofzqo/privsep.sock']
Dec 10 10:20:43 compute-0 podman[213256]: 2025-12-10 10:20:43.653567087 +0000 UTC m=+0.056921546 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 10 10:20:43 compute-0 nova_compute[186989]: 2025-12-10 10:20:43.967 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:20:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:44.306 104302 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 10 10:20:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:44.307 104302 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpcprofzqo/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 10 10:20:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:44.165 213279 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 10 10:20:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:44.169 213279 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 10 10:20:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:44.171 213279 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 10 10:20:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:44.171 213279 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213279
Dec 10 10:20:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:44.311 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[608d11ca-effe-4e78-a292-8d0da10a7e9c]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:44 compute-0 nova_compute[186989]: 2025-12-10 10:20:44.331 186993 DEBUG nova.compute.manager [req-22d39085-ffa0-4c71-961e-a59c872f8034 req-c2726253-7a65-4b1a-9a2b-ec6fa63f7fef 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Received event network-vif-plugged-71336fed-be63-43d7-a554-f86e2de83e54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:20:44 compute-0 nova_compute[186989]: 2025-12-10 10:20:44.332 186993 DEBUG oslo_concurrency.lockutils [req-22d39085-ffa0-4c71-961e-a59c872f8034 req-c2726253-7a65-4b1a-9a2b-ec6fa63f7fef 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:44 compute-0 nova_compute[186989]: 2025-12-10 10:20:44.332 186993 DEBUG oslo_concurrency.lockutils [req-22d39085-ffa0-4c71-961e-a59c872f8034 req-c2726253-7a65-4b1a-9a2b-ec6fa63f7fef 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:44 compute-0 nova_compute[186989]: 2025-12-10 10:20:44.332 186993 DEBUG oslo_concurrency.lockutils [req-22d39085-ffa0-4c71-961e-a59c872f8034 req-c2726253-7a65-4b1a-9a2b-ec6fa63f7fef 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:44 compute-0 nova_compute[186989]: 2025-12-10 10:20:44.332 186993 DEBUG nova.compute.manager [req-22d39085-ffa0-4c71-961e-a59c872f8034 req-c2726253-7a65-4b1a-9a2b-ec6fa63f7fef 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] No waiting events found dispatching network-vif-plugged-71336fed-be63-43d7-a554-f86e2de83e54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:20:44 compute-0 nova_compute[186989]: 2025-12-10 10:20:44.333 186993 WARNING nova.compute.manager [req-22d39085-ffa0-4c71-961e-a59c872f8034 req-c2726253-7a65-4b1a-9a2b-ec6fa63f7fef 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Received unexpected event network-vif-plugged-71336fed-be63-43d7-a554-f86e2de83e54 for instance with vm_state active and task_state None.
Dec 10 10:20:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:44.872 213279 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:44.872 213279 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:44.872 213279 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:44 compute-0 nova_compute[186989]: 2025-12-10 10:20:44.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:20:44 compute-0 nova_compute[186989]: 2025-12-10 10:20:44.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:20:44 compute-0 nova_compute[186989]: 2025-12-10 10:20:44.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:20:45 compute-0 nova_compute[186989]: 2025-12-10 10:20:45.096 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:20:45 compute-0 nova_compute[186989]: 2025-12-10 10:20:45.096 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquired lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:20:45 compute-0 nova_compute[186989]: 2025-12-10 10:20:45.097 186993 DEBUG nova.network.neutron [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 10 10:20:45 compute-0 nova_compute[186989]: 2025-12-10 10:20:45.097 186993 DEBUG nova.objects.instance [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 70a74a19-d800-4441-ae54-2289aed3ee93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.450 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[7e11535d-0e3e-47a3-a919-70a9c91f9bf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:45 compute-0 NetworkManager[55541]: <info>  [1765362045.4776] manager: (tap77ce0e41-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.476 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[297a9fd7-147e-42ad-8af9-c0c173828c02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.516 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[51388e83-d13f-4a03-b523-10fdb490023d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:45 compute-0 systemd-udevd[213291]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.519 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[4464b14b-c90b-4b8c-862a-749715ade4b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:45 compute-0 NetworkManager[55541]: <info>  [1765362045.5491] device (tap77ce0e41-b0): carrier: link connected
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.554 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[2b633305-14cf-412e-9143-883f498334c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.573 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[9154514b-d1f0-402d-9fe7-aa3fe535f506]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap77ce0e41-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:fd:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 306076, 'reachable_time': 33757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213309, 'error': None, 'target': 'ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.589 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[60362c88-a15b-4470-84b8-ae1952d579c2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:fda3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 306076, 'tstamp': 306076}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213310, 'error': None, 'target': 'ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.610 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[bac90218-16f7-45c5-9a43-b5e82804d4f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap77ce0e41-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:fd:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 306076, 'reachable_time': 33757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213311, 'error': None, 'target': 'ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.643 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3d3cba-61fb-4a3f-b220-6e07f33a6b24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.692 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[6c9e2825-d785-4311-ae38-a2a19305795c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.695 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77ce0e41-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.696 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.696 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77ce0e41-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:20:45 compute-0 NetworkManager[55541]: <info>  [1765362045.7453] manager: (tap77ce0e41-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Dec 10 10:20:45 compute-0 nova_compute[186989]: 2025-12-10 10:20:45.744 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:45 compute-0 kernel: tap77ce0e41-b0: entered promiscuous mode
Dec 10 10:20:45 compute-0 nova_compute[186989]: 2025-12-10 10:20:45.749 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.751 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap77ce0e41-b0, col_values=(('external_ids', {'iface-id': '68a9841b-8fd5-4a35-bd5f-bbaf042a1d2b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:20:45 compute-0 nova_compute[186989]: 2025-12-10 10:20:45.753 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:45 compute-0 ovn_controller[95452]: 2025-12-10T10:20:45Z|00031|binding|INFO|Releasing lport 68a9841b-8fd5-4a35-bd5f-bbaf042a1d2b from this chassis (sb_readonly=0)
Dec 10 10:20:45 compute-0 nova_compute[186989]: 2025-12-10 10:20:45.764 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.766 104302 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/77ce0e41-bb52-4715-b214-a29a8dab4ac8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/77ce0e41-bb52-4715-b214-a29a8dab4ac8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.766 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}c0d7281e73f49d0900061a5e042086b82fb0c5749ce07364eaf781b72fb0696f" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.769 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[74557e79-4d1f-4866-8603-2f1adb2857f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.771 104302 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: global
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     log         /dev/log local0 debug
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     log-tag     haproxy-metadata-proxy-77ce0e41-bb52-4715-b214-a29a8dab4ac8
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     user        root
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     group       root
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     maxconn     1024
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     pidfile     /var/lib/neutron/external/pids/77ce0e41-bb52-4715-b214-a29a8dab4ac8.pid.haproxy
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     daemon
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: defaults
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     log global
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     mode http
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     option httplog
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     option dontlognull
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     option http-server-close
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     option forwardfor
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     retries                 3
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     timeout http-request    30s
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     timeout connect         30s
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     timeout client          32s
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     timeout server          32s
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     timeout http-keep-alive 30s
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: listen listener
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     bind 169.254.169.254:80
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     server metadata /var/lib/neutron/metadata_proxy
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:     http-request add-header X-OVN-Network-ID 77ce0e41-bb52-4715-b214-a29a8dab4ac8
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 10 10:20:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:20:45.772 104302 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8', 'env', 'PROCESS_TAG=haproxy-77ce0e41-bb52-4715-b214-a29a8dab4ac8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/77ce0e41-bb52-4715-b214-a29a8dab4ac8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.842 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Wed, 10 Dec 2025 10:20:45 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d7379c5c-8c68-48b2-8851-14a6354e7359 x-openstack-request-id: req-d7379c5c-8c68-48b2-8851-14a6354e7359 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.843 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "584a7e5a-9d03-4771-8ffa-d6e7ef17b1a9", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/584a7e5a-9d03-4771-8ffa-d6e7ef17b1a9"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/584a7e5a-9d03-4771-8ffa-d6e7ef17b1a9"}]}, {"id": "6f9bf686-c5d3-4e9c-a944-269864569e67", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/6f9bf686-c5d3-4e9c-a944-269864569e67"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/6f9bf686-c5d3-4e9c-a944-269864569e67"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.843 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-d7379c5c-8c68-48b2-8851-14a6354e7359 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.846 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/6f9bf686-c5d3-4e9c-a944-269864569e67 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}c0d7281e73f49d0900061a5e042086b82fb0c5749ce07364eaf781b72fb0696f" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 10 10:20:45 compute-0 nova_compute[186989]: 2025-12-10 10:20:45.898 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.909 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Wed, 10 Dec 2025 10:20:45 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-0e289c98-e38d-4023-9dad-d2100d2d85ef x-openstack-request-id: req-0e289c98-e38d-4023-9dad-d2100d2d85ef _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.909 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "6f9bf686-c5d3-4e9c-a944-269864569e67", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/6f9bf686-c5d3-4e9c-a944-269864569e67"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/6f9bf686-c5d3-4e9c-a944-269864569e67"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.909 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/6f9bf686-c5d3-4e9c-a944-269864569e67 used request id req-0e289c98-e38d-4023-9dad-d2100d2d85ef request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.910 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'name': 'tempest-TestNetworkBasicOps-server-1525504432', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '82da19f85bb840d2a70395c3d761ef38', 'user_id': '603f9c3a99e145e4a64248329321a249', 'hostId': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.910 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.911 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.911 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1525504432>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1525504432>]
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.915 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 70a74a19-d800-4441-ae54-2289aed3ee93 / tap71336fed-be inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.915 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3bcf6403-f650-4196-acaf-0203b68a16f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000001-70a74a19-d800-4441-ae54-2289aed3ee93-tap71336fed-be', 'timestamp': '2025-12-10T10:20:45.912023', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'tap71336fed-be', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:a7:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71336fed-be'}, 'message_id': 'e3f8ffcc-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.185486146, 'message_signature': '28f0dbbaf3e514f7b354b78ff0569b45eda52a0dd8c2df32c5d888ccf49d4737'}]}, 'timestamp': '2025-12-10 10:20:45.916248', '_unique_id': 'b901d33062534fdab47effc7c3d4f7f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.920 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.923 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.944 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/cpu volume: 3430000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0404b9ab-fdad-4f4e-a53e-1bdb5726310a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3430000000, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'timestamp': '2025-12-10T10:20:45.923239', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e3fd6a30-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.217619494, 'message_signature': '2ef5e325d6da5534b8236789500e8c44b13b181a24bbbca41de748ccef77757d'}]}, 'timestamp': '2025-12-10 10:20:45.945075', '_unique_id': '236ee027f65944c595eace7b4cab1069'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.945 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.946 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.946 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.946 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1525504432>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1525504432>]
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.946 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.946 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8be03faa-5fd2-454b-96af-11c38a16843b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000001-70a74a19-d800-4441-ae54-2289aed3ee93-tap71336fed-be', 'timestamp': '2025-12-10T10:20:45.946946', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'tap71336fed-be', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:a7:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71336fed-be'}, 'message_id': 'e3fdc052-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.185486146, 'message_signature': 'd2b4c908acd8eae6e169dd3e266e6e73ed0e8738ef5d4a37472ccc4f1f5c9ba3'}]}, 'timestamp': '2025-12-10 10:20:45.947212', '_unique_id': 'f31f4320109d414fa335c360a91d3c0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.947 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.948 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.948 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0277f459-2fc2-42eb-9953-d0b41d50eae7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000001-70a74a19-d800-4441-ae54-2289aed3ee93-tap71336fed-be', 'timestamp': '2025-12-10T10:20:45.948315', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'tap71336fed-be', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:a7:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71336fed-be'}, 'message_id': 'e3fdf4aa-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.185486146, 'message_signature': '6eeab51e0e7d1adfec438f89d82566435663e7ad4ab1d51c9c4f6bbf5fa345a6'}]}, 'timestamp': '2025-12-10 10:20:45.948548', '_unique_id': '065955bbe94d415a8fcabacd83eb6a4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.949 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9782213c-975f-47aa-b26b-ad38c783783a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000001-70a74a19-d800-4441-ae54-2289aed3ee93-tap71336fed-be', 'timestamp': '2025-12-10T10:20:45.949630', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'tap71336fed-be', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:a7:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71336fed-be'}, 'message_id': 'e3fe290c-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.185486146, 'message_signature': '11ceab3a811558b4841387a0e9d772acff3fc2c6755823f4325a024ab57b35bb'}]}, 'timestamp': '2025-12-10 10:20:45.949889', '_unique_id': '9f7e9909ee7346a281eaaa7d79dbdff9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.950 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.951 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1525504432>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1525504432>]
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.951 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.981 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.981 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44938e8f-b40e-4ebd-aedd-db715523bb89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-vda', 'timestamp': '2025-12-10T10:20:45.951230', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e40300bc-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.224643146, 'message_signature': 'd18a498346db4fb74314fafbb08349c505c77500b961bab9fa48438397ea45ca'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-sda', 'timestamp': '2025-12-10T10:20:45.951230', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e4030ca6-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.224643146, 'message_signature': 'a825b8774f7dce1b3bc108c34cadfac602742e545dc3840f298d1bceb82e99bc'}]}, 'timestamp': '2025-12-10 10:20:45.981932', '_unique_id': 'b5234c58ca784e049ee1b9b2fe5fe9aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.982 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.983 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.983 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7c7f373-5e5c-4a76-bd08-c96d371cd9ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000001-70a74a19-d800-4441-ae54-2289aed3ee93-tap71336fed-be', 'timestamp': '2025-12-10T10:20:45.983805', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'tap71336fed-be', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:a7:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71336fed-be'}, 'message_id': 'e4035f58-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.185486146, 'message_signature': '102894eda397bb01a40a2a4be525beeb536f9e3d9e5b8c024c8f28e6bac3a211'}]}, 'timestamp': '2025-12-10 10:20:45.984048', '_unique_id': 'e65d8712d8a142c6a8236a971d7e71fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.984 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.985 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.985 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.985 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 70a74a19-d800-4441-ae54-2289aed3ee93: ceilometer.compute.pollsters.NoVolumeException
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.985 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.985 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f38f36f0-f2a9-4401-bb52-6a62c34ffbd3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-vda', 'timestamp': '2025-12-10T10:20:45.985447', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e4039f18-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.224643146, 'message_signature': '0aa317dabff958aa99b6c48ebbe99a9aee5794728382a89e2ac483db6f2f6666'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-sda', 'timestamp': '2025-12-10T10:20:45.985447', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e403a7f6-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.224643146, 'message_signature': '46ae7c41bf5249a2a8dcd8db6d36634d0050afecf1674fd6cd53137a12408b75'}]}, 'timestamp': '2025-12-10 10:20:45.985890', '_unique_id': 'dd1e2cbafdcc48f285be601400a5aa02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.986 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '606cba64-fce3-42f2-865b-47c50998fd7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000001-70a74a19-d800-4441-ae54-2289aed3ee93-tap71336fed-be', 'timestamp': '2025-12-10T10:20:45.986969', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'tap71336fed-be', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:a7:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71336fed-be'}, 'message_id': 'e403da96-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.185486146, 'message_signature': '0109ebbe79e3fc532c272161319c1bb5a4ce96e6e1c6ce34539d5a7ab1589f58'}]}, 'timestamp': '2025-12-10 10:20:45.987213', '_unique_id': '82e9fc9af18a4f23be436c45aaf08d24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.987 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.997 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.998 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7005cf8-b1d3-4e62-a751-5ef2741889a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-vda', 'timestamp': '2025-12-10T10:20:45.988348', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e4058b8e-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.261784022, 'message_signature': 'dad89932fe3d88f0094ca33b18a369db1038ba5b1f72a8fd9462acc9d3a633f9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-sda', 'timestamp': '2025-12-10T10:20:45.988348', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e4059840-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.261784022, 'message_signature': '26a7b4afd27f7d0e0fb687aac731e23d95fff5ba61ad8a7167910de986aa2c26'}]}, 'timestamp': '2025-12-10 10:20:45.998621', '_unique_id': '60b57a3116b24bc6aadcbed678a74ddd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:45.999 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.000 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.000 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.000 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1525504432>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1525504432>]
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.000 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.000 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1d7cc99-7b50-4d11-ab61-0813dd8b860f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000001-70a74a19-d800-4441-ae54-2289aed3ee93-tap71336fed-be', 'timestamp': '2025-12-10T10:20:46.000669', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'tap71336fed-be', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:a7:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71336fed-be'}, 'message_id': 'e405f416-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.185486146, 'message_signature': '6c89ae65de3948d5c9726d5eb2bcef6f83974d6601a16fb30b8a37553f6f0cbd'}]}, 'timestamp': '2025-12-10 10:20:46.000962', '_unique_id': '9cbc441aebb444c98f629e5da3390f58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.001 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.002 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.002 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.002 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18b00647-26d1-45c7-9cc9-2e5d6a291f2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-vda', 'timestamp': '2025-12-10T10:20:46.002149', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e4062cd8-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.261784022, 'message_signature': 'a8715cc65defc2ad6a5e6e67b83202d01e78d7287dc36873ea84290ef3a8186a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-sda', 'timestamp': '2025-12-10T10:20:46.002149', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e4063778-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.261784022, 'message_signature': '145a6568d1d0e9452242a3d03230aba58300023a2ad3eca11773fa9d9e38eb1a'}]}, 'timestamp': '2025-12-10 10:20:46.002681', '_unique_id': 'cdece070335648dfb6dc0285457365ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.003 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02e658a8-022f-4cbf-a6af-ec40e6839e5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000001-70a74a19-d800-4441-ae54-2289aed3ee93-tap71336fed-be', 'timestamp': '2025-12-10T10:20:46.003900', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'tap71336fed-be', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:a7:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71336fed-be'}, 'message_id': 'e4067170-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.185486146, 'message_signature': '64f595775e51c4d04907e6e7aca1a1c559226d46f884da5a893e074e42e48bcb'}]}, 'timestamp': '2025-12-10 10:20:46.004211', '_unique_id': '52c2f783ab7e4e7f857978da6968ad72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.004 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.005 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.005 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.read.latency volume: 139368472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.005 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.read.latency volume: 646447 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '549cc60c-7da4-48cd-af5f-4dc407dc0856', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 139368472, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-vda', 'timestamp': '2025-12-10T10:20:46.005384', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e406ab40-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.224643146, 'message_signature': '08810b2bbc4bcc2305bd27dfdd24fac983af4361e53020d8dc61550b4f1e226f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 646447, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 
'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-sda', 'timestamp': '2025-12-10T10:20:46.005384', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e406b6d0-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.224643146, 'message_signature': '4831ba75d4b04391ed5a7aadff7669719ef59ade7f0a56eb0cc694b5a27c81c0'}]}, 'timestamp': '2025-12-10 10:20:46.005940', '_unique_id': '1361f4a5460f4920aa28f397353de5c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.006 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.007 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.007 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.007 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcc60336-bb71-4660-beec-7932e8d44a02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-vda', 'timestamp': '2025-12-10T10:20:46.007114', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e406eede-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.224643146, 'message_signature': '9a498e52b4eab0e57a91f0bbe33c1efa449f292d955ec244391a04cc900d8110'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 
'70a74a19-d800-4441-ae54-2289aed3ee93-sda', 'timestamp': '2025-12-10T10:20:46.007114', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e406fa64-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.224643146, 'message_signature': 'd214bc8aa8303fdd1e35618dbd1aa820e817d2b7fee03175f2c0a804ab35536b'}]}, 'timestamp': '2025-12-10 10:20:46.007668', '_unique_id': 'afe88ded808b4dfa98451e39f787227d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.008 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85fe861b-08f4-472c-9f94-0028db01f391', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000001-70a74a19-d800-4441-ae54-2289aed3ee93-tap71336fed-be', 'timestamp': '2025-12-10T10:20:46.008813', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'tap71336fed-be', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:a7:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71336fed-be'}, 'message_id': 'e4073164-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.185486146, 'message_signature': '9fa439d6b36acdc43d996227533833a30c9ec729d57277df459541e44d3a268a'}]}, 'timestamp': '2025-12-10 10:20:46.009121', '_unique_id': '7373300367934dcaaa6e9e94f1536e99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.009 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.010 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.010 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.010 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5931b1cb-1512-47d2-8893-0b2aac3afe94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-vda', 'timestamp': '2025-12-10T10:20:46.010456', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e4077174-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.224643146, 'message_signature': '4436000d63294d1ab7b78f9bd522d0724a30334b88698bf23ffa01ec1548ec8d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-sda', 'timestamp': '2025-12-10T10:20:46.010456', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e4077d0e-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.224643146, 'message_signature': '042e3e5527201def271d9d692d6509f60a4fb568834944741800da63d8dcd863'}]}, 'timestamp': '2025-12-10 10:20:46.011010', '_unique_id': '7746a25813964d32b3d3549e10e7dde5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.011 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.012 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.012 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.012 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7652771-d62a-4fd2-970e-b016f13bcc74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-vda', 'timestamp': '2025-12-10T10:20:46.012190', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e407b512-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.261784022, 'message_signature': '45dfa4cdbd87eb606fedea168f857cc0a0c3d1526ea84ab22768554a4a2dd537'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-sda', 'timestamp': '2025-12-10T10:20:46.012190', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e407bf94-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.261784022, 'message_signature': '8297a19b01682720e532db8f09dab53dd75e2238fa317d72236e3920f3e8fc2f'}]}, 'timestamp': '2025-12-10 10:20:46.012732', '_unique_id': '3f9cc88d915c44bbaf2e54ad5fc3fe77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.013 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68cb5a42-da37-4b31-b614-9842b6645d4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000001-70a74a19-d800-4441-ae54-2289aed3ee93-tap71336fed-be', 'timestamp': '2025-12-10T10:20:46.013934', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'tap71336fed-be', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:15:a7:21', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap71336fed-be'}, 'message_id': 'e407f950-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.185486146, 'message_signature': 'fcd5b1b32b222423a38bf07ea5c73ac64bcd82ce391091ef86d2fcbdb9fc2327'}]}, 'timestamp': '2025-12-10 10:20:46.014241', '_unique_id': '2ce0c0f5fc2447b8b4146f3f7a28d957'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.014 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.015 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.015 12 DEBUG ceilometer.compute.pollsters [-] 70a74a19-d800-4441-ae54-2289aed3ee93/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cf534c4-eb66-47cb-9add-62579d94076d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-vda', 'timestamp': '2025-12-10T10:20:46.015359', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e4083140-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.224643146, 'message_signature': '62ecb0c88c46f686fbba03339792f0ba6963788683680109de0b2f0bd7d5ae22'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '70a74a19-d800-4441-ae54-2289aed3ee93-sda', 'timestamp': '2025-12-10T10:20:46.015359', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1525504432', 'name': 'instance-00000001', 'instance_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e4083a28-d5b1-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3061.224643146, 'message_signature': '2b8a201972c3794ef3b7cd6686966be14a7dc8c6bdc989af634ab8c1dc1752a7'}]}, 'timestamp': '2025-12-10 10:20:46.015853', '_unique_id': '136dda6f7f674d4d91aa87c168f62509'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:20:46 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:20:46.016 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:20:46 compute-0 podman[213344]: 2025-12-10 10:20:46.217988745 +0000 UTC m=+0.060660759 container create d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Dec 10 10:20:46 compute-0 systemd[1]: Started libpod-conmon-d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22.scope.
Dec 10 10:20:46 compute-0 podman[213344]: 2025-12-10 10:20:46.182882326 +0000 UTC m=+0.025554380 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:20:46 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:20:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd7b347a8c3f6cdfae26f531cdfcfe6591c2b75daf7331ed5bace4fd422ea74b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:20:46 compute-0 podman[213344]: 2025-12-10 10:20:46.318268115 +0000 UTC m=+0.160940219 container init d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 10 10:20:46 compute-0 podman[213344]: 2025-12-10 10:20:46.323780236 +0000 UTC m=+0.166452260 container start d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 10 10:20:46 compute-0 neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8[213360]: [NOTICE]   (213364) : New worker (213366) forked
Dec 10 10:20:46 compute-0 neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8[213360]: [NOTICE]   (213364) : Loading success.
Dec 10 10:20:46 compute-0 ovn_controller[95452]: 2025-12-10T10:20:46Z|00032|binding|INFO|Releasing lport 68a9841b-8fd5-4a35-bd5f-bbaf042a1d2b from this chassis (sb_readonly=0)
Dec 10 10:20:46 compute-0 NetworkManager[55541]: <info>  [1765362046.9024] manager: (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/23)
Dec 10 10:20:46 compute-0 NetworkManager[55541]: <info>  [1765362046.9029] device (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:20:46 compute-0 NetworkManager[55541]: <warn>  [1765362046.9032] device (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 10 10:20:46 compute-0 NetworkManager[55541]: <info>  [1765362046.9038] manager: (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/24)
Dec 10 10:20:46 compute-0 NetworkManager[55541]: <info>  [1765362046.9040] device (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 10 10:20:46 compute-0 NetworkManager[55541]: <warn>  [1765362046.9041] device (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 10 10:20:46 compute-0 NetworkManager[55541]: <info>  [1765362046.9047] manager: (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Dec 10 10:20:46 compute-0 NetworkManager[55541]: <info>  [1765362046.9052] manager: (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Dec 10 10:20:46 compute-0 NetworkManager[55541]: <info>  [1765362046.9056] device (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 10 10:20:46 compute-0 NetworkManager[55541]: <info>  [1765362046.9060] device (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 10 10:20:46 compute-0 nova_compute[186989]: 2025-12-10 10:20:46.912 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:46 compute-0 ovn_controller[95452]: 2025-12-10T10:20:46Z|00033|binding|INFO|Releasing lport 68a9841b-8fd5-4a35-bd5f-bbaf042a1d2b from this chassis (sb_readonly=0)
Dec 10 10:20:46 compute-0 nova_compute[186989]: 2025-12-10 10:20:46.931 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:46 compute-0 nova_compute[186989]: 2025-12-10 10:20:46.935 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:47 compute-0 nova_compute[186989]: 2025-12-10 10:20:47.017 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.177 186993 DEBUG nova.network.neutron [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Updating instance_info_cache with network_info: [{"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.205 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Releasing lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.206 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.206 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.206 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.206 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.207 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.207 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.207 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.207 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.235 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.236 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.237 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.237 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.252 186993 DEBUG nova.compute.manager [req-da24db5a-5fec-4916-959a-accb6151bcc5 req-ae2cab6d-dfef-4a8a-9bbf-d900fb86e9ca 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Received event network-changed-71336fed-be63-43d7-a554-f86e2de83e54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.252 186993 DEBUG nova.compute.manager [req-da24db5a-5fec-4916-959a-accb6151bcc5 req-ae2cab6d-dfef-4a8a-9bbf-d900fb86e9ca 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Refreshing instance network info cache due to event network-changed-71336fed-be63-43d7-a554-f86e2de83e54. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.253 186993 DEBUG oslo_concurrency.lockutils [req-da24db5a-5fec-4916-959a-accb6151bcc5 req-ae2cab6d-dfef-4a8a-9bbf-d900fb86e9ca 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.253 186993 DEBUG oslo_concurrency.lockutils [req-da24db5a-5fec-4916-959a-accb6151bcc5 req-ae2cab6d-dfef-4a8a-9bbf-d900fb86e9ca 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.254 186993 DEBUG nova.network.neutron [req-da24db5a-5fec-4916-959a-accb6151bcc5 req-ae2cab6d-dfef-4a8a-9bbf-d900fb86e9ca 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Refreshing network info cache for port 71336fed-be63-43d7-a554-f86e2de83e54 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.345 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.416 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.417 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.473 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.663 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.665 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5577MB free_disk=73.33304214477539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.665 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.665 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.793 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Instance 70a74a19-d800-4441-ae54-2289aed3ee93 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.794 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.794 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.845 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Refreshing inventories for resource provider 94de3f96-a911-486c-b08b-8a5da489baa6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.941 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Updating ProviderTree inventory for provider 94de3f96-a911-486c-b08b-8a5da489baa6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.941 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Updating inventory in ProviderTree for provider 94de3f96-a911-486c-b08b-8a5da489baa6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.967 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Refreshing aggregate associations for resource provider 94de3f96-a911-486c-b08b-8a5da489baa6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 10 10:20:48 compute-0 nova_compute[186989]: 2025-12-10 10:20:48.995 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Refreshing trait associations for resource provider 94de3f96-a911-486c-b08b-8a5da489baa6, traits: HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_MMX,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_CLMUL,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 10 10:20:49 compute-0 nova_compute[186989]: 2025-12-10 10:20:49.049 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Updating inventory in ProviderTree for provider 94de3f96-a911-486c-b08b-8a5da489baa6 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 10 10:20:49 compute-0 nova_compute[186989]: 2025-12-10 10:20:49.092 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Updated inventory for provider 94de3f96-a911-486c-b08b-8a5da489baa6 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 10 10:20:49 compute-0 nova_compute[186989]: 2025-12-10 10:20:49.093 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Updating resource provider 94de3f96-a911-486c-b08b-8a5da489baa6 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 10 10:20:49 compute-0 nova_compute[186989]: 2025-12-10 10:20:49.093 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Updating inventory in ProviderTree for provider 94de3f96-a911-486c-b08b-8a5da489baa6 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 10 10:20:49 compute-0 nova_compute[186989]: 2025-12-10 10:20:49.115 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:20:49 compute-0 nova_compute[186989]: 2025-12-10 10:20:49.116 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.451s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:20:50 compute-0 podman[213384]: 2025-12-10 10:20:50.024823045 +0000 UTC m=+0.062860889 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd)
Dec 10 10:20:50 compute-0 podman[213383]: 2025-12-10 10:20:50.047904265 +0000 UTC m=+0.085812016 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 10 10:20:50 compute-0 podman[213385]: 2025-12-10 10:20:50.057186519 +0000 UTC m=+0.089200199 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 10 10:20:50 compute-0 nova_compute[186989]: 2025-12-10 10:20:50.119 186993 DEBUG nova.network.neutron [req-da24db5a-5fec-4916-959a-accb6151bcc5 req-ae2cab6d-dfef-4a8a-9bbf-d900fb86e9ca 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Updated VIF entry in instance network info cache for port 71336fed-be63-43d7-a554-f86e2de83e54. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:20:50 compute-0 nova_compute[186989]: 2025-12-10 10:20:50.120 186993 DEBUG nova.network.neutron [req-da24db5a-5fec-4916-959a-accb6151bcc5 req-ae2cab6d-dfef-4a8a-9bbf-d900fb86e9ca 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Updating instance_info_cache with network_info: [{"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:20:50 compute-0 nova_compute[186989]: 2025-12-10 10:20:50.140 186993 DEBUG oslo_concurrency.lockutils [req-da24db5a-5fec-4916-959a-accb6151bcc5 req-ae2cab6d-dfef-4a8a-9bbf-d900fb86e9ca 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:20:50 compute-0 nova_compute[186989]: 2025-12-10 10:20:50.900 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:51 compute-0 nova_compute[186989]: 2025-12-10 10:20:51.112 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:20:52 compute-0 nova_compute[186989]: 2025-12-10 10:20:52.019 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:53 compute-0 ovn_controller[95452]: 2025-12-10T10:20:53Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:a7:21 10.100.0.8
Dec 10 10:20:53 compute-0 ovn_controller[95452]: 2025-12-10T10:20:53Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:a7:21 10.100.0.8
Dec 10 10:20:55 compute-0 nova_compute[186989]: 2025-12-10 10:20:55.901 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:57 compute-0 nova_compute[186989]: 2025-12-10 10:20:57.022 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:20:58 compute-0 podman[213464]: 2025-12-10 10:20:58.07754466 +0000 UTC m=+0.108062493 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64)
Dec 10 10:21:00 compute-0 nova_compute[186989]: 2025-12-10 10:21:00.849 186993 INFO nova.compute.manager [None req-4c4a467a-955a-4dc1-8e00-e38d0fe95416 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Get console output
Dec 10 10:21:00 compute-0 nova_compute[186989]: 2025-12-10 10:21:00.903 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:00 compute-0 nova_compute[186989]: 2025-12-10 10:21:00.955 213152 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 10 10:21:01 compute-0 podman[213485]: 2025-12-10 10:21:01.03826568 +0000 UTC m=+0.077918811 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:21:02 compute-0 nova_compute[186989]: 2025-12-10 10:21:02.025 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:05 compute-0 nova_compute[186989]: 2025-12-10 10:21:05.905 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:07 compute-0 nova_compute[186989]: 2025-12-10 10:21:07.028 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:10 compute-0 nova_compute[186989]: 2025-12-10 10:21:10.908 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.389 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "e684ab58-5cc5-41f8-8460-b90ff9621838" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.390 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.409 186993 DEBUG nova.compute.manager [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.480 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.480 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.489 186993 DEBUG nova.virt.hardware [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.489 186993 INFO nova.compute.claims [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.646 186993 DEBUG nova.compute.provider_tree [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.668 186993 DEBUG nova.scheduler.client.report [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.687 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.688 186993 DEBUG nova.compute.manager [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.739 186993 DEBUG nova.compute.manager [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.739 186993 DEBUG nova.network.neutron [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:21:11 compute-0 nova_compute[186989]: 2025-12-10 10:21:11.766 186993 INFO nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.030 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:12 compute-0 podman[213509]: 2025-12-10 10:21:12.047583563 +0000 UTC m=+0.081654113 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.113 186993 DEBUG nova.policy [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.169 186993 DEBUG nova.compute.manager [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.351 186993 DEBUG nova.compute.manager [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.353 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.354 186993 INFO nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Creating image(s)
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.355 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.356 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.357 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.384 186993 DEBUG oslo_concurrency.processutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.480 186993 DEBUG oslo_concurrency.processutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.482 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.483 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.512 186993 DEBUG oslo_concurrency.processutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.584 186993 DEBUG oslo_concurrency.processutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.585 186993 DEBUG oslo_concurrency.processutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.630 186993 DEBUG oslo_concurrency.processutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.632 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.633 186993 DEBUG oslo_concurrency.processutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.706 186993 DEBUG oslo_concurrency.processutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.708 186993 DEBUG nova.virt.disk.api [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.709 186993 DEBUG oslo_concurrency.processutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.764 186993 DEBUG oslo_concurrency.processutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.766 186993 DEBUG nova.virt.disk.api [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.767 186993 DEBUG nova.objects.instance [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid e684ab58-5cc5-41f8-8460-b90ff9621838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.782 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.783 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Ensure instance console log exists: /var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.783 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.784 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:12 compute-0 nova_compute[186989]: 2025-12-10 10:21:12.784 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:13 compute-0 nova_compute[186989]: 2025-12-10 10:21:13.177 186993 DEBUG nova.network.neutron [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Successfully created port: 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:21:13 compute-0 nova_compute[186989]: 2025-12-10 10:21:13.784 186993 DEBUG nova.network.neutron [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Successfully updated port: 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:21:13 compute-0 nova_compute[186989]: 2025-12-10 10:21:13.801 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-e684ab58-5cc5-41f8-8460-b90ff9621838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:21:13 compute-0 nova_compute[186989]: 2025-12-10 10:21:13.802 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-e684ab58-5cc5-41f8-8460-b90ff9621838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:21:13 compute-0 nova_compute[186989]: 2025-12-10 10:21:13.802 186993 DEBUG nova.network.neutron [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:21:13 compute-0 nova_compute[186989]: 2025-12-10 10:21:13.866 186993 DEBUG nova.compute.manager [req-07b5a827-0945-4783-be08-3bd914e84870 req-69a0e4bb-7bed-4a3d-a4be-0e4996975e90 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Received event network-changed-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:21:13 compute-0 nova_compute[186989]: 2025-12-10 10:21:13.867 186993 DEBUG nova.compute.manager [req-07b5a827-0945-4783-be08-3bd914e84870 req-69a0e4bb-7bed-4a3d-a4be-0e4996975e90 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Refreshing instance network info cache due to event network-changed-7dd7e1d8-ef9a-4b68-a771-5d978fb90732. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:21:13 compute-0 nova_compute[186989]: 2025-12-10 10:21:13.868 186993 DEBUG oslo_concurrency.lockutils [req-07b5a827-0945-4783-be08-3bd914e84870 req-69a0e4bb-7bed-4a3d-a4be-0e4996975e90 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-e684ab58-5cc5-41f8-8460-b90ff9621838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:21:14 compute-0 podman[213549]: 2025-12-10 10:21:14.052173262 +0000 UTC m=+0.085713043 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.096 186993 DEBUG nova.network.neutron [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.722 186993 DEBUG nova.network.neutron [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Updating instance_info_cache with network_info: [{"id": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "address": "fa:16:3e:ef:ac:e8", "network": {"id": "e8d11afc-ce42-4557-a831-96c90958b58c", "bridge": "br-int", "label": "tempest-network-smoke--1290461304", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dd7e1d8-ef", "ovs_interfaceid": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.740 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-e684ab58-5cc5-41f8-8460-b90ff9621838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.741 186993 DEBUG nova.compute.manager [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Instance network_info: |[{"id": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "address": "fa:16:3e:ef:ac:e8", "network": {"id": "e8d11afc-ce42-4557-a831-96c90958b58c", "bridge": "br-int", "label": "tempest-network-smoke--1290461304", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dd7e1d8-ef", "ovs_interfaceid": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.741 186993 DEBUG oslo_concurrency.lockutils [req-07b5a827-0945-4783-be08-3bd914e84870 req-69a0e4bb-7bed-4a3d-a4be-0e4996975e90 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-e684ab58-5cc5-41f8-8460-b90ff9621838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.741 186993 DEBUG nova.network.neutron [req-07b5a827-0945-4783-be08-3bd914e84870 req-69a0e4bb-7bed-4a3d-a4be-0e4996975e90 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Refreshing network info cache for port 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.743 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Start _get_guest_xml network_info=[{"id": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "address": "fa:16:3e:ef:ac:e8", "network": {"id": "e8d11afc-ce42-4557-a831-96c90958b58c", "bridge": "br-int", "label": "tempest-network-smoke--1290461304", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dd7e1d8-ef", "ovs_interfaceid": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.748 186993 WARNING nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.753 186993 DEBUG nova.virt.libvirt.host [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.753 186993 DEBUG nova.virt.libvirt.host [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.757 186993 DEBUG nova.virt.libvirt.host [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.757 186993 DEBUG nova.virt.libvirt.host [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.758 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.758 186993 DEBUG nova.virt.hardware [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.758 186993 DEBUG nova.virt.hardware [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.758 186993 DEBUG nova.virt.hardware [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.759 186993 DEBUG nova.virt.hardware [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.759 186993 DEBUG nova.virt.hardware [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.759 186993 DEBUG nova.virt.hardware [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.759 186993 DEBUG nova.virt.hardware [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.759 186993 DEBUG nova.virt.hardware [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.760 186993 DEBUG nova.virt.hardware [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.760 186993 DEBUG nova.virt.hardware [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.760 186993 DEBUG nova.virt.hardware [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.763 186993 DEBUG nova.virt.libvirt.vif [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:21:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-751769766',display_name='tempest-TestNetworkBasicOps-server-751769766',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-751769766',id=2,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBILp5bOzIvOOD8URqP8ws4ggLlFMm6XUp/qh9hFPIl4S58Nra9wEW9/tevWVrql5neDAER+HdRbRvcloVZC/eckigHNzdGjYIw+gVITybqUw8LEbmC6+cycTPlWeg8NpPg==',key_name='tempest-TestNetworkBasicOps-1189253526',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-7zk2rx3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:21:12Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=e684ab58-5cc5-41f8-8460-b90ff9621838,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "address": "fa:16:3e:ef:ac:e8", "network": {"id": "e8d11afc-ce42-4557-a831-96c90958b58c", "bridge": "br-int", "label": "tempest-network-smoke--1290461304", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dd7e1d8-ef", "ovs_interfaceid": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.763 186993 DEBUG nova.network.os_vif_util [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "address": "fa:16:3e:ef:ac:e8", "network": {"id": "e8d11afc-ce42-4557-a831-96c90958b58c", "bridge": "br-int", "label": "tempest-network-smoke--1290461304", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dd7e1d8-ef", "ovs_interfaceid": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.764 186993 DEBUG nova.network.os_vif_util [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ac:e8,bridge_name='br-int',has_traffic_filtering=True,id=7dd7e1d8-ef9a-4b68-a771-5d978fb90732,network=Network(e8d11afc-ce42-4557-a831-96c90958b58c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dd7e1d8-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.765 186993 DEBUG nova.objects.instance [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid e684ab58-5cc5-41f8-8460-b90ff9621838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.776 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:21:14 compute-0 nova_compute[186989]:   <uuid>e684ab58-5cc5-41f8-8460-b90ff9621838</uuid>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   <name>instance-00000002</name>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-751769766</nova:name>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:21:14</nova:creationTime>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:21:14 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:21:14 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:21:14 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:21:14 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:21:14 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:21:14 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:21:14 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:21:14 compute-0 nova_compute[186989]:         <nova:port uuid="7dd7e1d8-ef9a-4b68-a771-5d978fb90732">
Dec 10 10:21:14 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <system>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <entry name="serial">e684ab58-5cc5-41f8-8460-b90ff9621838</entry>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <entry name="uuid">e684ab58-5cc5-41f8-8460-b90ff9621838</entry>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     </system>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   <os>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   </os>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   <features>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   </features>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk.config"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:ef:ac:e8"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <target dev="tap7dd7e1d8-ef"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/console.log" append="off"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <video>
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     </video>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:21:14 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:21:14 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:21:14 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:21:14 compute-0 nova_compute[186989]: </domain>
Dec 10 10:21:14 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.777 186993 DEBUG nova.compute.manager [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Preparing to wait for external event network-vif-plugged-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.777 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.777 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.777 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.778 186993 DEBUG nova.virt.libvirt.vif [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:21:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-751769766',display_name='tempest-TestNetworkBasicOps-server-751769766',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-751769766',id=2,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBILp5bOzIvOOD8URqP8ws4ggLlFMm6XUp/qh9hFPIl4S58Nra9wEW9/tevWVrql5neDAER+HdRbRvcloVZC/eckigHNzdGjYIw+gVITybqUw8LEbmC6+cycTPlWeg8NpPg==',key_name='tempest-TestNetworkBasicOps-1189253526',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-7zk2rx3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:21:12Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=e684ab58-5cc5-41f8-8460-b90ff9621838,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "address": "fa:16:3e:ef:ac:e8", "network": {"id": "e8d11afc-ce42-4557-a831-96c90958b58c", "bridge": "br-int", "label": "tempest-network-smoke--1290461304", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dd7e1d8-ef", "ovs_interfaceid": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.778 186993 DEBUG nova.network.os_vif_util [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "address": "fa:16:3e:ef:ac:e8", "network": {"id": "e8d11afc-ce42-4557-a831-96c90958b58c", "bridge": "br-int", "label": "tempest-network-smoke--1290461304", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dd7e1d8-ef", "ovs_interfaceid": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.778 186993 DEBUG nova.network.os_vif_util [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ac:e8,bridge_name='br-int',has_traffic_filtering=True,id=7dd7e1d8-ef9a-4b68-a771-5d978fb90732,network=Network(e8d11afc-ce42-4557-a831-96c90958b58c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dd7e1d8-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.779 186993 DEBUG os_vif [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ac:e8,bridge_name='br-int',has_traffic_filtering=True,id=7dd7e1d8-ef9a-4b68-a771-5d978fb90732,network=Network(e8d11afc-ce42-4557-a831-96c90958b58c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dd7e1d8-ef') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.779 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.780 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.780 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.785 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.785 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7dd7e1d8-ef, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.786 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7dd7e1d8-ef, col_values=(('external_ids', {'iface-id': '7dd7e1d8-ef9a-4b68-a771-5d978fb90732', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:ac:e8', 'vm-uuid': 'e684ab58-5cc5-41f8-8460-b90ff9621838'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.787 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:14 compute-0 NetworkManager[55541]: <info>  [1765362074.7885] manager: (tap7dd7e1d8-ef): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.789 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.830 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.833 186993 INFO os_vif [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ac:e8,bridge_name='br-int',has_traffic_filtering=True,id=7dd7e1d8-ef9a-4b68-a771-5d978fb90732,network=Network(e8d11afc-ce42-4557-a831-96c90958b58c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dd7e1d8-ef')
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.901 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.903 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.903 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:ef:ac:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:21:14 compute-0 nova_compute[186989]: 2025-12-10 10:21:14.904 186993 INFO nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Using config drive
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.279 186993 INFO nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Creating config drive at /var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk.config
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.284 186993 DEBUG oslo_concurrency.processutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwi_d0htv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.420 186993 DEBUG oslo_concurrency.processutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwi_d0htv" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:21:15 compute-0 NetworkManager[55541]: <info>  [1765362075.4885] manager: (tap7dd7e1d8-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Dec 10 10:21:15 compute-0 kernel: tap7dd7e1d8-ef: entered promiscuous mode
Dec 10 10:21:15 compute-0 ovn_controller[95452]: 2025-12-10T10:21:15Z|00034|binding|INFO|Claiming lport 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 for this chassis.
Dec 10 10:21:15 compute-0 ovn_controller[95452]: 2025-12-10T10:21:15Z|00035|binding|INFO|7dd7e1d8-ef9a-4b68-a771-5d978fb90732: Claiming fa:16:3e:ef:ac:e8 10.100.0.29
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.492 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.502 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:ac:e8 10.100.0.29'], port_security=['fa:16:3e:ef:ac:e8 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': 'e684ab58-5cc5-41f8-8460-b90ff9621838', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8d11afc-ce42-4557-a831-96c90958b58c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': '769d614f-7880-4c63-a7e0-fa18e62e6f4b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9ec9c22-f11a-4b69-9218-54988741fe3a, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=7dd7e1d8-ef9a-4b68-a771-5d978fb90732) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.505 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 in datapath e8d11afc-ce42-4557-a831-96c90958b58c bound to our chassis
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.506 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e8d11afc-ce42-4557-a831-96c90958b58c
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.524 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[89dfc05f-10ba-41ff-ad39-f6dcb49cc7f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.524 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape8d11afc-c1 in ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.527 213247 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape8d11afc-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.527 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa3963a-afec-423b-8166-382f83588cc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.528 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[4d29a698-cb02-412d-8573-890a80376fd3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 systemd-machined[153379]: New machine qemu-2-instance-00000002.
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.550 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:15 compute-0 ovn_controller[95452]: 2025-12-10T10:21:15Z|00036|binding|INFO|Setting lport 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 ovn-installed in OVS
Dec 10 10:21:15 compute-0 ovn_controller[95452]: 2025-12-10T10:21:15Z|00037|binding|INFO|Setting lport 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 up in Southbound
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.555 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.557 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0059e0-1480-4904-9c52-3b00aac9a5c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Dec 10 10:21:15 compute-0 systemd-udevd[213592]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.583 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[d479b252-e831-4ff4-814f-25a584a48fd1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 NetworkManager[55541]: <info>  [1765362075.5908] device (tap7dd7e1d8-ef): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:21:15 compute-0 NetworkManager[55541]: <info>  [1765362075.5920] device (tap7dd7e1d8-ef): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.617 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[3e61bb96-b84a-4c52-9a67-628afeae0e77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.624 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[45308d5d-db9a-4678-b110-88fb7f9fe8bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 NetworkManager[55541]: <info>  [1765362075.6263] manager: (tape8d11afc-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.661 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[52dc0d34-897d-4f17-bf91-29f0f34943e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.663 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9e217f-823c-42e3-83eb-1fa0cf22b3c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 NetworkManager[55541]: <info>  [1765362075.6942] device (tape8d11afc-c0): carrier: link connected
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.702 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[319192d3-cd36-4aa5-a0ef-d2013c8b1ebd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.729 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[71de3875-57d6-4fdd-a9d2-5c5731908389]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8d11afc-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:b9:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 309091, 'reachable_time': 19057, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213627, 'error': None, 'target': 'ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.751 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5f26c57d-b4f4-4b9c-b7c8-20ccc0bd6921]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:b92a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 309091, 'tstamp': 309091}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213630, 'error': None, 'target': 'ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.775 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[13238df7-ec13-4bef-af5f-eb52834769f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8d11afc-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:b9:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 309091, 'reachable_time': 19057, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213631, 'error': None, 'target': 'ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.819 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[198989e2-79b5-494c-ab72-4b8c6eff51b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.823 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362075.8226457, e684ab58-5cc5-41f8-8460-b90ff9621838 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.824 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] VM Started (Lifecycle Event)
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.887 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[1976a8cf-a94a-478c-93c6-85acde3b8013]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.890 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8d11afc-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.890 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.891 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape8d11afc-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:21:15 compute-0 kernel: tape8d11afc-c0: entered promiscuous mode
Dec 10 10:21:15 compute-0 NetworkManager[55541]: <info>  [1765362075.9272] manager: (tape8d11afc-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.927 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.930 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape8d11afc-c0, col_values=(('external_ids', {'iface-id': '290ef888-91d0-452a-928a-e1b265008f07'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.931 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:21:15 compute-0 ovn_controller[95452]: 2025-12-10T10:21:15Z|00038|binding|INFO|Releasing lport 290ef888-91d0-452a-928a-e1b265008f07 from this chassis (sb_readonly=0)
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.932 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.938 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362075.826067, e684ab58-5cc5-41f8-8460-b90ff9621838 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.938 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] VM Paused (Lifecycle Event)
Dec 10 10:21:15 compute-0 nova_compute[186989]: 2025-12-10 10:21:15.958 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.960 104302 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e8d11afc-ce42-4557-a831-96c90958b58c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e8d11afc-ce42-4557-a831-96c90958b58c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.961 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[2338e47e-3319-491f-be94-f8386ab928ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.962 104302 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: global
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     log         /dev/log local0 debug
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     log-tag     haproxy-metadata-proxy-e8d11afc-ce42-4557-a831-96c90958b58c
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     user        root
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     group       root
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     maxconn     1024
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     pidfile     /var/lib/neutron/external/pids/e8d11afc-ce42-4557-a831-96c90958b58c.pid.haproxy
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     daemon
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: defaults
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     log global
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     mode http
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     option httplog
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     option dontlognull
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     option http-server-close
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     option forwardfor
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     retries                 3
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     timeout http-request    30s
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     timeout connect         30s
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     timeout client          32s
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     timeout server          32s
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     timeout http-keep-alive 30s
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: listen listener
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     bind 169.254.169.254:80
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     server metadata /var/lib/neutron/metadata_proxy
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:     http-request add-header X-OVN-Network-ID e8d11afc-ce42-4557-a831-96c90958b58c
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 10 10:21:15 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:15.963 104302 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c', 'env', 'PROCESS_TAG=haproxy-e8d11afc-ce42-4557-a831-96c90958b58c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e8d11afc-ce42-4557-a831-96c90958b58c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.012 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.017 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.071 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.088 186993 DEBUG nova.compute.manager [req-e4b8f7aa-b72c-46c5-a1de-157807e3d6bf req-82653df1-03a4-4ca1-87c0-eddc73cd84b2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Received event network-vif-plugged-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.089 186993 DEBUG oslo_concurrency.lockutils [req-e4b8f7aa-b72c-46c5-a1de-157807e3d6bf req-82653df1-03a4-4ca1-87c0-eddc73cd84b2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.089 186993 DEBUG oslo_concurrency.lockutils [req-e4b8f7aa-b72c-46c5-a1de-157807e3d6bf req-82653df1-03a4-4ca1-87c0-eddc73cd84b2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.090 186993 DEBUG oslo_concurrency.lockutils [req-e4b8f7aa-b72c-46c5-a1de-157807e3d6bf req-82653df1-03a4-4ca1-87c0-eddc73cd84b2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.090 186993 DEBUG nova.compute.manager [req-e4b8f7aa-b72c-46c5-a1de-157807e3d6bf req-82653df1-03a4-4ca1-87c0-eddc73cd84b2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Processing event network-vif-plugged-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.091 186993 DEBUG nova.compute.manager [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.096 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362076.0964868, e684ab58-5cc5-41f8-8460-b90ff9621838 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.097 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] VM Resumed (Lifecycle Event)
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.099 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.103 186993 INFO nova.virt.libvirt.driver [-] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Instance spawned successfully.
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.104 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.113 186993 DEBUG nova.network.neutron [req-07b5a827-0945-4783-be08-3bd914e84870 req-69a0e4bb-7bed-4a3d-a4be-0e4996975e90 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Updated VIF entry in instance network info cache for port 7dd7e1d8-ef9a-4b68-a771-5d978fb90732. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.114 186993 DEBUG nova.network.neutron [req-07b5a827-0945-4783-be08-3bd914e84870 req-69a0e4bb-7bed-4a3d-a4be-0e4996975e90 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Updating instance_info_cache with network_info: [{"id": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "address": "fa:16:3e:ef:ac:e8", "network": {"id": "e8d11afc-ce42-4557-a831-96c90958b58c", "bridge": "br-int", "label": "tempest-network-smoke--1290461304", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dd7e1d8-ef", "ovs_interfaceid": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.149 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.149 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.150 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.150 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.151 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.151 186993 DEBUG nova.virt.libvirt.driver [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.162 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.166 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.222 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.317 186993 INFO nova.compute.manager [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Took 3.97 seconds to spawn the instance on the hypervisor.
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.318 186993 DEBUG nova.compute.manager [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.357 186993 DEBUG oslo_concurrency.lockutils [req-07b5a827-0945-4783-be08-3bd914e84870 req-69a0e4bb-7bed-4a3d-a4be-0e4996975e90 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-e684ab58-5cc5-41f8-8460-b90ff9621838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:21:16 compute-0 podman[213664]: 2025-12-10 10:21:16.378241397 +0000 UTC m=+0.060840354 container create b896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.391 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:16 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:16.393 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:d5:69', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:b1:dd:ed:fa:0b'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:21:16 compute-0 systemd[1]: Started libpod-conmon-b896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b.scope.
Dec 10 10:21:16 compute-0 podman[213664]: 2025-12-10 10:21:16.344005121 +0000 UTC m=+0.026604188 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:21:16 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:21:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61bfdcff463d2984f20f2d7ee0b398e6ec3692f09dc1a6d4c256478f0ff483d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:21:16 compute-0 podman[213664]: 2025-12-10 10:21:16.471427303 +0000 UTC m=+0.154026300 container init b896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 10 10:21:16 compute-0 podman[213664]: 2025-12-10 10:21:16.479437462 +0000 UTC m=+0.162036439 container start b896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 10 10:21:16 compute-0 neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c[213680]: [NOTICE]   (213684) : New worker (213686) forked
Dec 10 10:21:16 compute-0 neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c[213680]: [NOTICE]   (213684) : Loading success.
Dec 10 10:21:16 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:16.541 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.573 186993 INFO nova.compute.manager [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Took 5.12 seconds to build instance.
Dec 10 10:21:16 compute-0 nova_compute[186989]: 2025-12-10 10:21:16.691 186993 DEBUG oslo_concurrency.lockutils [None req-613c061a-6a3a-48f9-a94a-75984252408e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:17 compute-0 nova_compute[186989]: 2025-12-10 10:21:17.066 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:18 compute-0 nova_compute[186989]: 2025-12-10 10:21:18.193 186993 DEBUG nova.compute.manager [req-e669ae6d-544b-4b58-aa4a-4a77b63e21c4 req-64916ddb-f6fd-4686-96b8-cc4e64414157 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Received event network-vif-plugged-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:21:18 compute-0 nova_compute[186989]: 2025-12-10 10:21:18.195 186993 DEBUG oslo_concurrency.lockutils [req-e669ae6d-544b-4b58-aa4a-4a77b63e21c4 req-64916ddb-f6fd-4686-96b8-cc4e64414157 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:18 compute-0 nova_compute[186989]: 2025-12-10 10:21:18.195 186993 DEBUG oslo_concurrency.lockutils [req-e669ae6d-544b-4b58-aa4a-4a77b63e21c4 req-64916ddb-f6fd-4686-96b8-cc4e64414157 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:18 compute-0 nova_compute[186989]: 2025-12-10 10:21:18.196 186993 DEBUG oslo_concurrency.lockutils [req-e669ae6d-544b-4b58-aa4a-4a77b63e21c4 req-64916ddb-f6fd-4686-96b8-cc4e64414157 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:18 compute-0 nova_compute[186989]: 2025-12-10 10:21:18.197 186993 DEBUG nova.compute.manager [req-e669ae6d-544b-4b58-aa4a-4a77b63e21c4 req-64916ddb-f6fd-4686-96b8-cc4e64414157 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] No waiting events found dispatching network-vif-plugged-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:21:18 compute-0 nova_compute[186989]: 2025-12-10 10:21:18.198 186993 WARNING nova.compute.manager [req-e669ae6d-544b-4b58-aa4a-4a77b63e21c4 req-64916ddb-f6fd-4686-96b8-cc4e64414157 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Received unexpected event network-vif-plugged-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 for instance with vm_state active and task_state None.
Dec 10 10:21:19 compute-0 nova_compute[186989]: 2025-12-10 10:21:19.788 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:21 compute-0 podman[213697]: 2025-12-10 10:21:21.049420387 +0000 UTC m=+0.081193190 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:21:21 compute-0 podman[213698]: 2025-12-10 10:21:21.064613232 +0000 UTC m=+0.088341535 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec 10 10:21:21 compute-0 podman[213699]: 2025-12-10 10:21:21.091437675 +0000 UTC m=+0.110598884 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 10 10:21:21 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:21.543 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:21:22 compute-0 nova_compute[186989]: 2025-12-10 10:21:22.069 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:24 compute-0 sshd-session[213695]: Connection reset by 205.210.31.36 port 63890 [preauth]
Dec 10 10:21:24 compute-0 nova_compute[186989]: 2025-12-10 10:21:24.789 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:27 compute-0 nova_compute[186989]: 2025-12-10 10:21:27.097 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:29 compute-0 podman[213781]: 2025-12-10 10:21:29.040177039 +0000 UTC m=+0.076281046 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 10 10:21:29 compute-0 ovn_controller[95452]: 2025-12-10T10:21:29Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ef:ac:e8 10.100.0.29
Dec 10 10:21:29 compute-0 ovn_controller[95452]: 2025-12-10T10:21:29Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ef:ac:e8 10.100.0.29
Dec 10 10:21:29 compute-0 nova_compute[186989]: 2025-12-10 10:21:29.797 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:31.462 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:31.463 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:31.464 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:32 compute-0 podman[213800]: 2025-12-10 10:21:32.051030601 +0000 UTC m=+0.085482527 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:21:32 compute-0 nova_compute[186989]: 2025-12-10 10:21:32.101 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:34 compute-0 nova_compute[186989]: 2025-12-10 10:21:34.811 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:37 compute-0 nova_compute[186989]: 2025-12-10 10:21:37.112 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:39 compute-0 nova_compute[186989]: 2025-12-10 10:21:39.816 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.625 186993 DEBUG oslo_concurrency.lockutils [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "e684ab58-5cc5-41f8-8460-b90ff9621838" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.627 186993 DEBUG oslo_concurrency.lockutils [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.628 186993 DEBUG oslo_concurrency.lockutils [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.628 186993 DEBUG oslo_concurrency.lockutils [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.629 186993 DEBUG oslo_concurrency.lockutils [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.631 186993 INFO nova.compute.manager [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Terminating instance
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.633 186993 DEBUG nova.compute.manager [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:21:40 compute-0 kernel: tap7dd7e1d8-ef (unregistering): left promiscuous mode
Dec 10 10:21:40 compute-0 NetworkManager[55541]: <info>  [1765362100.6650] device (tap7dd7e1d8-ef): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:21:40 compute-0 ovn_controller[95452]: 2025-12-10T10:21:40Z|00039|binding|INFO|Releasing lport 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 from this chassis (sb_readonly=0)
Dec 10 10:21:40 compute-0 ovn_controller[95452]: 2025-12-10T10:21:40Z|00040|binding|INFO|Setting lport 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 down in Southbound
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.678 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:40 compute-0 ovn_controller[95452]: 2025-12-10T10:21:40Z|00041|binding|INFO|Removing iface tap7dd7e1d8-ef ovn-installed in OVS
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.682 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.705 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:40 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec 10 10:21:40 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 13.125s CPU time.
Dec 10 10:21:40 compute-0 systemd-machined[153379]: Machine qemu-2-instance-00000002 terminated.
Dec 10 10:21:40 compute-0 kernel: tap7dd7e1d8-ef: entered promiscuous mode
Dec 10 10:21:40 compute-0 kernel: tap7dd7e1d8-ef (unregistering): left promiscuous mode
Dec 10 10:21:40 compute-0 NetworkManager[55541]: <info>  [1765362100.8657] manager: (tap7dd7e1d8-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Dec 10 10:21:40 compute-0 ovn_controller[95452]: 2025-12-10T10:21:40Z|00042|if_status|INFO|Not updating pb chassis for 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 now as sb is readonly
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.869 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:40 compute-0 ovn_controller[95452]: 2025-12-10T10:21:40Z|00043|binding|INFO|Releasing lport 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 from this chassis (sb_readonly=1)
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.902 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:40 compute-0 ovn_controller[95452]: 2025-12-10T10:21:40Z|00044|if_status|INFO|Not setting lport 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 down as sb is readonly
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.916 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.935 186993 INFO nova.virt.libvirt.driver [-] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Instance destroyed successfully.
Dec 10 10:21:40 compute-0 nova_compute[186989]: 2025-12-10 10:21:40.935 186993 DEBUG nova.objects.instance [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid e684ab58-5cc5-41f8-8460-b90ff9621838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.043 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:ac:e8 10.100.0.29'], port_security=['fa:16:3e:ef:ac:e8 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': 'e684ab58-5cc5-41f8-8460-b90ff9621838', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8d11afc-ce42-4557-a831-96c90958b58c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '4', 'neutron:security_group_ids': '769d614f-7880-4c63-a7e0-fa18e62e6f4b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9ec9c22-f11a-4b69-9218-54988741fe3a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=7dd7e1d8-ef9a-4b68-a771-5d978fb90732) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.045 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 7dd7e1d8-ef9a-4b68-a771-5d978fb90732 in datapath e8d11afc-ce42-4557-a831-96c90958b58c unbound from our chassis
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.047 104302 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e8d11afc-ce42-4557-a831-96c90958b58c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.049 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[7c9c43dd-9a29-4777-b595-e00704073f17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.050 104302 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c namespace which is not needed anymore
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.054 186993 DEBUG nova.virt.libvirt.vif [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:21:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-751769766',display_name='tempest-TestNetworkBasicOps-server-751769766',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-751769766',id=2,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBILp5bOzIvOOD8URqP8ws4ggLlFMm6XUp/qh9hFPIl4S58Nra9wEW9/tevWVrql5neDAER+HdRbRvcloVZC/eckigHNzdGjYIw+gVITybqUw8LEbmC6+cycTPlWeg8NpPg==',key_name='tempest-TestNetworkBasicOps-1189253526',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:21:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-7zk2rx3p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:21:16Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=e684ab58-5cc5-41f8-8460-b90ff9621838,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "address": "fa:16:3e:ef:ac:e8", "network": {"id": "e8d11afc-ce42-4557-a831-96c90958b58c", "bridge": "br-int", "label": "tempest-network-smoke--1290461304", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dd7e1d8-ef", "ovs_interfaceid": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.055 186993 DEBUG nova.network.os_vif_util [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "address": "fa:16:3e:ef:ac:e8", "network": {"id": "e8d11afc-ce42-4557-a831-96c90958b58c", "bridge": "br-int", "label": "tempest-network-smoke--1290461304", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7dd7e1d8-ef", "ovs_interfaceid": "7dd7e1d8-ef9a-4b68-a771-5d978fb90732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.058 186993 DEBUG nova.network.os_vif_util [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ac:e8,bridge_name='br-int',has_traffic_filtering=True,id=7dd7e1d8-ef9a-4b68-a771-5d978fb90732,network=Network(e8d11afc-ce42-4557-a831-96c90958b58c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dd7e1d8-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.059 186993 DEBUG os_vif [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ac:e8,bridge_name='br-int',has_traffic_filtering=True,id=7dd7e1d8-ef9a-4b68-a771-5d978fb90732,network=Network(e8d11afc-ce42-4557-a831-96c90958b58c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dd7e1d8-ef') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.066 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.067 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7dd7e1d8-ef, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.070 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.073 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.080 186993 INFO os_vif [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:ac:e8,bridge_name='br-int',has_traffic_filtering=True,id=7dd7e1d8-ef9a-4b68-a771-5d978fb90732,network=Network(e8d11afc-ce42-4557-a831-96c90958b58c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7dd7e1d8-ef')
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.081 186993 INFO nova.virt.libvirt.driver [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Deleting instance files /var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838_del
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.082 186993 INFO nova.virt.libvirt.driver [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Deletion of /var/lib/nova/instances/e684ab58-5cc5-41f8-8460-b90ff9621838_del complete
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.168 186993 DEBUG nova.virt.libvirt.host [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.169 186993 INFO nova.virt.libvirt.host [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] UEFI support detected
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.173 186993 INFO nova.compute.manager [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Took 0.54 seconds to destroy the instance on the hypervisor.
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.173 186993 DEBUG oslo.service.loopingcall [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.174 186993 DEBUG nova.compute.manager [-] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.174 186993 DEBUG nova.network.neutron [-] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:21:41 compute-0 neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c[213680]: [NOTICE]   (213684) : haproxy version is 2.8.14-c23fe91
Dec 10 10:21:41 compute-0 neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c[213680]: [NOTICE]   (213684) : path to executable is /usr/sbin/haproxy
Dec 10 10:21:41 compute-0 neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c[213680]: [WARNING]  (213684) : Exiting Master process...
Dec 10 10:21:41 compute-0 neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c[213680]: [ALERT]    (213684) : Current worker (213686) exited with code 143 (Terminated)
Dec 10 10:21:41 compute-0 neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c[213680]: [WARNING]  (213684) : All workers exited. Exiting... (0)
Dec 10 10:21:41 compute-0 systemd[1]: libpod-b896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b.scope: Deactivated successfully.
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.256 186993 DEBUG nova.compute.manager [req-38069cc4-05b9-43cb-8e15-dc15332220d3 req-6113fa2b-d961-41af-8232-690e4ad6445e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Received event network-vif-unplugged-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.258 186993 DEBUG oslo_concurrency.lockutils [req-38069cc4-05b9-43cb-8e15-dc15332220d3 req-6113fa2b-d961-41af-8232-690e4ad6445e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.258 186993 DEBUG oslo_concurrency.lockutils [req-38069cc4-05b9-43cb-8e15-dc15332220d3 req-6113fa2b-d961-41af-8232-690e4ad6445e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.259 186993 DEBUG oslo_concurrency.lockutils [req-38069cc4-05b9-43cb-8e15-dc15332220d3 req-6113fa2b-d961-41af-8232-690e4ad6445e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.259 186993 DEBUG nova.compute.manager [req-38069cc4-05b9-43cb-8e15-dc15332220d3 req-6113fa2b-d961-41af-8232-690e4ad6445e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] No waiting events found dispatching network-vif-unplugged-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.260 186993 DEBUG nova.compute.manager [req-38069cc4-05b9-43cb-8e15-dc15332220d3 req-6113fa2b-d961-41af-8232-690e4ad6445e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Received event network-vif-unplugged-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 10 10:21:41 compute-0 podman[213869]: 2025-12-10 10:21:41.261403627 +0000 UTC m=+0.063134366 container died b896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 10 10:21:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b-userdata-shm.mount: Deactivated successfully.
Dec 10 10:21:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-61bfdcff463d2984f20f2d7ee0b398e6ec3692f09dc1a6d4c256478f0ff483d2-merged.mount: Deactivated successfully.
Dec 10 10:21:41 compute-0 podman[213869]: 2025-12-10 10:21:41.31636615 +0000 UTC m=+0.118096939 container cleanup b896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:21:41 compute-0 systemd[1]: libpod-conmon-b896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b.scope: Deactivated successfully.
Dec 10 10:21:41 compute-0 podman[213900]: 2025-12-10 10:21:41.413654908 +0000 UTC m=+0.060300489 container remove b896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.422 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[6c8d98bf-6d28-4505-a333-cbc8d491fe83]: (4, ('Wed Dec 10 10:21:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c (b896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b)\nb896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b\nWed Dec 10 10:21:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c (b896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b)\nb896e3f522858f1f4fcd586e82bed9080710fb054f5209d035ff3d76bb62588b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.424 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[df08e5f2-de1b-4dc1-8bbe-8f5f6928be02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.426 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8d11afc-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.429 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:41 compute-0 kernel: tape8d11afc-c0: left promiscuous mode
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.444 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.449 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[bb01f781-c522-4be7-898f-2c3df82c96b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.475 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[36c4eb73-2e05-4c95-9940-8742706d4bd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.476 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5e04d1-b349-46b1-8d08-076debab9e33]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.500 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f3451308-4e4b-43ba-81a4-f5bda387156e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 309082, 'reachable_time': 26378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213915, 'error': None, 'target': 'ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:41 compute-0 systemd[1]: run-netns-ovnmeta\x2de8d11afc\x2dce42\x2d4557\x2da831\x2d96c90958b58c.mount: Deactivated successfully.
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.515 104414 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e8d11afc-ce42-4557-a831-96c90958b58c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 10 10:21:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:41.517 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[81bac315-a7ab-4887-afd9-350d56d9de51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.771 186993 DEBUG nova.network.neutron [-] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.792 186993 INFO nova.compute.manager [-] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Took 0.62 seconds to deallocate network for instance.
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.837 186993 DEBUG oslo_concurrency.lockutils [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.838 186993 DEBUG oslo_concurrency.lockutils [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.842 186993 DEBUG nova.compute.manager [req-077220a7-f21e-4ce8-a128-35274d3ddfa6 req-d821ba8d-d68d-4d50-91e7-ebabe499498c 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Received event network-vif-deleted-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.926 186993 DEBUG nova.compute.provider_tree [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.944 186993 DEBUG nova.scheduler.client.report [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.968 186993 DEBUG oslo_concurrency.lockutils [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:41 compute-0 nova_compute[186989]: 2025-12-10 10:21:41.992 186993 INFO nova.scheduler.client.report [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance e684ab58-5cc5-41f8-8460-b90ff9621838
Dec 10 10:21:42 compute-0 nova_compute[186989]: 2025-12-10 10:21:42.077 186993 DEBUG oslo_concurrency.lockutils [None req-e370fcb3-5e61-41fe-a0ef-727922b0d650 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.450s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:42 compute-0 nova_compute[186989]: 2025-12-10 10:21:42.115 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:43 compute-0 podman[213917]: 2025-12-10 10:21:43.074055179 +0000 UTC m=+0.097376992 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 10 10:21:43 compute-0 nova_compute[186989]: 2025-12-10 10:21:43.332 186993 DEBUG nova.compute.manager [req-80fe6f21-915e-433b-966e-60cf8c68d8b6 req-0b7abeb2-3739-480a-a1b0-a25443f5970b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Received event network-vif-plugged-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:21:43 compute-0 nova_compute[186989]: 2025-12-10 10:21:43.333 186993 DEBUG oslo_concurrency.lockutils [req-80fe6f21-915e-433b-966e-60cf8c68d8b6 req-0b7abeb2-3739-480a-a1b0-a25443f5970b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:43 compute-0 nova_compute[186989]: 2025-12-10 10:21:43.333 186993 DEBUG oslo_concurrency.lockutils [req-80fe6f21-915e-433b-966e-60cf8c68d8b6 req-0b7abeb2-3739-480a-a1b0-a25443f5970b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:43 compute-0 nova_compute[186989]: 2025-12-10 10:21:43.334 186993 DEBUG oslo_concurrency.lockutils [req-80fe6f21-915e-433b-966e-60cf8c68d8b6 req-0b7abeb2-3739-480a-a1b0-a25443f5970b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e684ab58-5cc5-41f8-8460-b90ff9621838-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:43 compute-0 nova_compute[186989]: 2025-12-10 10:21:43.334 186993 DEBUG nova.compute.manager [req-80fe6f21-915e-433b-966e-60cf8c68d8b6 req-0b7abeb2-3739-480a-a1b0-a25443f5970b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] No waiting events found dispatching network-vif-plugged-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:21:43 compute-0 nova_compute[186989]: 2025-12-10 10:21:43.334 186993 WARNING nova.compute.manager [req-80fe6f21-915e-433b-966e-60cf8c68d8b6 req-0b7abeb2-3739-480a-a1b0-a25443f5970b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Received unexpected event network-vif-plugged-7dd7e1d8-ef9a-4b68-a771-5d978fb90732 for instance with vm_state deleted and task_state None.
Dec 10 10:21:44 compute-0 ovn_controller[95452]: 2025-12-10T10:21:44Z|00045|binding|INFO|Releasing lport 68a9841b-8fd5-4a35-bd5f-bbaf042a1d2b from this chassis (sb_readonly=0)
Dec 10 10:21:44 compute-0 nova_compute[186989]: 2025-12-10 10:21:44.491 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:44 compute-0 nova_compute[186989]: 2025-12-10 10:21:44.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:21:45 compute-0 podman[213942]: 2025-12-10 10:21:45.041810868 +0000 UTC m=+0.069328505 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.536 186993 DEBUG nova.compute.manager [req-56f9f6f4-ddf9-4049-8134-286773919a0e req-38c49331-9c17-417c-ac54-6bbdb9bb3e57 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Received event network-changed-71336fed-be63-43d7-a554-f86e2de83e54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.537 186993 DEBUG nova.compute.manager [req-56f9f6f4-ddf9-4049-8134-286773919a0e req-38c49331-9c17-417c-ac54-6bbdb9bb3e57 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Refreshing instance network info cache due to event network-changed-71336fed-be63-43d7-a554-f86e2de83e54. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.537 186993 DEBUG oslo_concurrency.lockutils [req-56f9f6f4-ddf9-4049-8134-286773919a0e req-38c49331-9c17-417c-ac54-6bbdb9bb3e57 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.537 186993 DEBUG oslo_concurrency.lockutils [req-56f9f6f4-ddf9-4049-8134-286773919a0e req-38c49331-9c17-417c-ac54-6bbdb9bb3e57 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.538 186993 DEBUG nova.network.neutron [req-56f9f6f4-ddf9-4049-8134-286773919a0e req-38c49331-9c17-417c-ac54-6bbdb9bb3e57 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Refreshing network info cache for port 71336fed-be63-43d7-a554-f86e2de83e54 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.662 186993 DEBUG oslo_concurrency.lockutils [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "70a74a19-d800-4441-ae54-2289aed3ee93" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.663 186993 DEBUG oslo_concurrency.lockutils [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.663 186993 DEBUG oslo_concurrency.lockutils [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.664 186993 DEBUG oslo_concurrency.lockutils [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.664 186993 DEBUG oslo_concurrency.lockutils [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.665 186993 INFO nova.compute.manager [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Terminating instance
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.666 186993 DEBUG nova.compute.manager [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:21:45 compute-0 kernel: tap71336fed-be (unregistering): left promiscuous mode
Dec 10 10:21:45 compute-0 NetworkManager[55541]: <info>  [1765362105.7069] device (tap71336fed-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:21:45 compute-0 ovn_controller[95452]: 2025-12-10T10:21:45Z|00046|binding|INFO|Releasing lport 71336fed-be63-43d7-a554-f86e2de83e54 from this chassis (sb_readonly=0)
Dec 10 10:21:45 compute-0 ovn_controller[95452]: 2025-12-10T10:21:45Z|00047|binding|INFO|Setting lport 71336fed-be63-43d7-a554-f86e2de83e54 down in Southbound
Dec 10 10:21:45 compute-0 ovn_controller[95452]: 2025-12-10T10:21:45Z|00048|binding|INFO|Removing iface tap71336fed-be ovn-installed in OVS
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.708 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:45.715 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:a7:21 10.100.0.8'], port_security=['fa:16:3e:15:a7:21 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '70a74a19-d800-4441-ae54-2289aed3ee93', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-77ce0e41-bb52-4715-b214-a29a8dab4ac8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '4', 'neutron:security_group_ids': '51b83539-312c-4548-95d6-df26c7e14f7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2b87c41-5ca6-4aa4-a17b-506801c09033, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=71336fed-be63-43d7-a554-f86e2de83e54) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:21:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:45.716 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 71336fed-be63-43d7-a554-f86e2de83e54 in datapath 77ce0e41-bb52-4715-b214-a29a8dab4ac8 unbound from our chassis
Dec 10 10:21:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:45.717 104302 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 77ce0e41-bb52-4715-b214-a29a8dab4ac8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 10 10:21:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:45.718 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5738f6c2-493f-4a5f-9bb5-a57d00922658]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:45.718 104302 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8 namespace which is not needed anymore
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.727 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:45 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec 10 10:21:45 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 15.284s CPU time.
Dec 10 10:21:45 compute-0 systemd-machined[153379]: Machine qemu-1-instance-00000001 terminated.
Dec 10 10:21:45 compute-0 neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8[213360]: [NOTICE]   (213364) : haproxy version is 2.8.14-c23fe91
Dec 10 10:21:45 compute-0 neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8[213360]: [NOTICE]   (213364) : path to executable is /usr/sbin/haproxy
Dec 10 10:21:45 compute-0 neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8[213360]: [WARNING]  (213364) : Exiting Master process...
Dec 10 10:21:45 compute-0 neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8[213360]: [ALERT]    (213364) : Current worker (213366) exited with code 143 (Terminated)
Dec 10 10:21:45 compute-0 neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8[213360]: [WARNING]  (213364) : All workers exited. Exiting... (0)
Dec 10 10:21:45 compute-0 systemd[1]: libpod-d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22.scope: Deactivated successfully.
Dec 10 10:21:45 compute-0 conmon[213360]: conmon d4546fa63c3634a0e91a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22.scope/container/memory.events
Dec 10 10:21:45 compute-0 podman[213986]: 2025-12-10 10:21:45.871551832 +0000 UTC m=+0.051124628 container died d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 10 10:21:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22-userdata-shm.mount: Deactivated successfully.
Dec 10 10:21:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-dd7b347a8c3f6cdfae26f531cdfcfe6591c2b75daf7331ed5bace4fd422ea74b-merged.mount: Deactivated successfully.
Dec 10 10:21:45 compute-0 podman[213986]: 2025-12-10 10:21:45.918992628 +0000 UTC m=+0.098565424 container cleanup d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.937 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.939 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.942 186993 INFO nova.virt.libvirt.driver [-] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Instance destroyed successfully.
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.942 186993 DEBUG nova.objects.instance [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid 70a74a19-d800-4441-ae54-2289aed3ee93 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:21:45 compute-0 systemd[1]: libpod-conmon-d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22.scope: Deactivated successfully.
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.955 186993 DEBUG nova.virt.libvirt.vif [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:20:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1525504432',display_name='tempest-TestNetworkBasicOps-server-1525504432',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1525504432',id=1,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOC6rK9F/ucDVofHFBa4F/1C0QjnIdaWZG2fWPFLbTZPf05eQ2wAiWAsLJ4rFrU4CRwA3UP4SHrJF3f+0pj0vvX7tiyE0cT3cm/SzlIJPJkQR0Xuox9T9cjPlhHFH6cC4g==',key_name='tempest-TestNetworkBasicOps-166157555',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:20:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-fgkp9kpt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:20:42Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=70a74a19-d800-4441-ae54-2289aed3ee93,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.956 186993 DEBUG nova.network.os_vif_util [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.959 186993 DEBUG nova.network.os_vif_util [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:a7:21,bridge_name='br-int',has_traffic_filtering=True,id=71336fed-be63-43d7-a554-f86e2de83e54,network=Network(77ce0e41-bb52-4715-b214-a29a8dab4ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71336fed-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.960 186993 DEBUG os_vif [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:a7:21,bridge_name='br-int',has_traffic_filtering=True,id=71336fed-be63-43d7-a554-f86e2de83e54,network=Network(77ce0e41-bb52-4715-b214-a29a8dab4ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71336fed-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.962 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.963 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71336fed-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:21:45 compute-0 nova_compute[186989]: 2025-12-10 10:21:45.997 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.000 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.002 186993 INFO os_vif [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:a7:21,bridge_name='br-int',has_traffic_filtering=True,id=71336fed-be63-43d7-a554-f86e2de83e54,network=Network(77ce0e41-bb52-4715-b214-a29a8dab4ac8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71336fed-be')
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.003 186993 INFO nova.virt.libvirt.driver [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Deleting instance files /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93_del
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.004 186993 INFO nova.virt.libvirt.driver [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Deletion of /var/lib/nova/instances/70a74a19-d800-4441-ae54-2289aed3ee93_del complete
Dec 10 10:21:46 compute-0 podman[214032]: 2025-12-10 10:21:46.018540398 +0000 UTC m=+0.072482852 container remove d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 10 10:21:46 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:46.023 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[48a93788-e2c7-4716-a529-a15b953ec2f7]: (4, ('Wed Dec 10 10:21:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8 (d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22)\nd4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22\nWed Dec 10 10:21:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8 (d4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22)\nd4546fa63c3634a0e91a5836a152cc5fde665abe486c56240e58238713c7df22\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:46 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:46.025 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[6e69df62-0877-4403-9dd6-27b61f17e411]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:46 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:46.026 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77ce0e41-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:21:46 compute-0 kernel: tap77ce0e41-b0: left promiscuous mode
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.030 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:46 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:46.033 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[d9b41f8c-aeb0-4b4c-8226-9b5a84f3481c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.042 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.049 186993 INFO nova.compute.manager [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Took 0.38 seconds to destroy the instance on the hypervisor.
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.050 186993 DEBUG oslo.service.loopingcall [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.050 186993 DEBUG nova.compute.manager [-] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.050 186993 DEBUG nova.network.neutron [-] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:21:46 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:46.053 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[8411352d-255b-4bee-907b-7eb953c95f69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:46 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:46.054 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[738e1562-88ad-44d2-8665-5619c149a6f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:46 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:46.071 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5fa65bd7-fbf7-4aeb-aead-f584ea775c19]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 306066, 'reachable_time': 27878, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214048, 'error': None, 'target': 'ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:46 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:46.074 104414 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-77ce0e41-bb52-4715-b214-a29a8dab4ac8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 10 10:21:46 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:21:46.074 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[968dded1-1f52-4c1a-a685-595e422bcb5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:21:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d77ce0e41\x2dbb52\x2d4715\x2db214\x2da29a8dab4ac8.mount: Deactivated successfully.
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.773 186993 DEBUG nova.network.neutron [-] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.791 186993 INFO nova.compute.manager [-] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Took 0.74 seconds to deallocate network for instance.
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.834 186993 DEBUG oslo_concurrency.lockutils [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.835 186993 DEBUG oslo_concurrency.lockutils [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.842 186993 DEBUG nova.compute.manager [req-bc3de93e-7cbc-4560-996e-98085cacea07 req-572d6c62-f063-4d16-a31e-ee54afd7a6dd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Received event network-vif-deleted-71336fed-be63-43d7-a554-f86e2de83e54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.889 186993 DEBUG nova.compute.provider_tree [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.904 186993 DEBUG nova.scheduler.client.report [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.925 186993 DEBUG oslo_concurrency.lockutils [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.937 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.938 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.938 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:21:46 compute-0 nova_compute[186989]: 2025-12-10 10:21:46.952 186993 INFO nova.scheduler.client.report [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance 70a74a19-d800-4441-ae54-2289aed3ee93
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.012 186993 DEBUG oslo_concurrency.lockutils [None req-4cf8f28a-b672-4c92-9394-3dc52142d995 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.036 186993 DEBUG nova.network.neutron [req-56f9f6f4-ddf9-4049-8134-286773919a0e req-38c49331-9c17-417c-ac54-6bbdb9bb3e57 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Updated VIF entry in instance network info cache for port 71336fed-be63-43d7-a554-f86e2de83e54. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.037 186993 DEBUG nova.network.neutron [req-56f9f6f4-ddf9-4049-8134-286773919a0e req-38c49331-9c17-417c-ac54-6bbdb9bb3e57 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Updating instance_info_cache with network_info: [{"id": "71336fed-be63-43d7-a554-f86e2de83e54", "address": "fa:16:3e:15:a7:21", "network": {"id": "77ce0e41-bb52-4715-b214-a29a8dab4ac8", "bridge": "br-int", "label": "tempest-network-smoke--1887473496", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap71336fed-be", "ovs_interfaceid": "71336fed-be63-43d7-a554-f86e2de83e54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.053 186993 DEBUG oslo_concurrency.lockutils [req-56f9f6f4-ddf9-4049-8134-286773919a0e req-38c49331-9c17-417c-ac54-6bbdb9bb3e57 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-70a74a19-d800-4441-ae54-2289aed3ee93" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.117 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.611 186993 DEBUG nova.compute.manager [req-22e02c9e-7d65-416f-ac61-64b7ca985ef2 req-b14fd7cc-9d61-44e4-8e9b-1540650ba6c4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Received event network-vif-unplugged-71336fed-be63-43d7-a554-f86e2de83e54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.612 186993 DEBUG oslo_concurrency.lockutils [req-22e02c9e-7d65-416f-ac61-64b7ca985ef2 req-b14fd7cc-9d61-44e4-8e9b-1540650ba6c4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.612 186993 DEBUG oslo_concurrency.lockutils [req-22e02c9e-7d65-416f-ac61-64b7ca985ef2 req-b14fd7cc-9d61-44e4-8e9b-1540650ba6c4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.613 186993 DEBUG oslo_concurrency.lockutils [req-22e02c9e-7d65-416f-ac61-64b7ca985ef2 req-b14fd7cc-9d61-44e4-8e9b-1540650ba6c4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.613 186993 DEBUG nova.compute.manager [req-22e02c9e-7d65-416f-ac61-64b7ca985ef2 req-b14fd7cc-9d61-44e4-8e9b-1540650ba6c4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] No waiting events found dispatching network-vif-unplugged-71336fed-be63-43d7-a554-f86e2de83e54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.614 186993 WARNING nova.compute.manager [req-22e02c9e-7d65-416f-ac61-64b7ca985ef2 req-b14fd7cc-9d61-44e4-8e9b-1540650ba6c4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Received unexpected event network-vif-unplugged-71336fed-be63-43d7-a554-f86e2de83e54 for instance with vm_state deleted and task_state None.
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.614 186993 DEBUG nova.compute.manager [req-22e02c9e-7d65-416f-ac61-64b7ca985ef2 req-b14fd7cc-9d61-44e4-8e9b-1540650ba6c4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Received event network-vif-plugged-71336fed-be63-43d7-a554-f86e2de83e54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.615 186993 DEBUG oslo_concurrency.lockutils [req-22e02c9e-7d65-416f-ac61-64b7ca985ef2 req-b14fd7cc-9d61-44e4-8e9b-1540650ba6c4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.615 186993 DEBUG oslo_concurrency.lockutils [req-22e02c9e-7d65-416f-ac61-64b7ca985ef2 req-b14fd7cc-9d61-44e4-8e9b-1540650ba6c4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.615 186993 DEBUG oslo_concurrency.lockutils [req-22e02c9e-7d65-416f-ac61-64b7ca985ef2 req-b14fd7cc-9d61-44e4-8e9b-1540650ba6c4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "70a74a19-d800-4441-ae54-2289aed3ee93-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.616 186993 DEBUG nova.compute.manager [req-22e02c9e-7d65-416f-ac61-64b7ca985ef2 req-b14fd7cc-9d61-44e4-8e9b-1540650ba6c4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] No waiting events found dispatching network-vif-plugged-71336fed-be63-43d7-a554-f86e2de83e54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.616 186993 WARNING nova.compute.manager [req-22e02c9e-7d65-416f-ac61-64b7ca985ef2 req-b14fd7cc-9d61-44e4-8e9b-1540650ba6c4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Received unexpected event network-vif-plugged-71336fed-be63-43d7-a554-f86e2de83e54 for instance with vm_state deleted and task_state None.
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:21:47 compute-0 nova_compute[186989]: 2025-12-10 10:21:47.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:21:48 compute-0 nova_compute[186989]: 2025-12-10 10:21:48.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:21:48 compute-0 nova_compute[186989]: 2025-12-10 10:21:48.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:21:48 compute-0 nova_compute[186989]: 2025-12-10 10:21:48.965 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:48 compute-0 nova_compute[186989]: 2025-12-10 10:21:48.965 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:48 compute-0 nova_compute[186989]: 2025-12-10 10:21:48.965 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:48 compute-0 nova_compute[186989]: 2025-12-10 10:21:48.965 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:21:49 compute-0 nova_compute[186989]: 2025-12-10 10:21:49.189 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:21:49 compute-0 nova_compute[186989]: 2025-12-10 10:21:49.191 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5755MB free_disk=73.33386611938477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:21:49 compute-0 nova_compute[186989]: 2025-12-10 10:21:49.191 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:21:49 compute-0 nova_compute[186989]: 2025-12-10 10:21:49.191 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:21:49 compute-0 nova_compute[186989]: 2025-12-10 10:21:49.236 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:21:49 compute-0 nova_compute[186989]: 2025-12-10 10:21:49.237 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:21:49 compute-0 nova_compute[186989]: 2025-12-10 10:21:49.258 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:21:49 compute-0 nova_compute[186989]: 2025-12-10 10:21:49.272 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:21:49 compute-0 nova_compute[186989]: 2025-12-10 10:21:49.306 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:21:49 compute-0 nova_compute[186989]: 2025-12-10 10:21:49.307 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:21:50 compute-0 nova_compute[186989]: 2025-12-10 10:21:50.668 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:50 compute-0 nova_compute[186989]: 2025-12-10 10:21:50.773 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:50 compute-0 nova_compute[186989]: 2025-12-10 10:21:50.997 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:52 compute-0 podman[214052]: 2025-12-10 10:21:52.05623196 +0000 UTC m=+0.075950476 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 10 10:21:52 compute-0 podman[214053]: 2025-12-10 10:21:52.073904184 +0000 UTC m=+0.095029898 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 10 10:21:52 compute-0 podman[214054]: 2025-12-10 10:21:52.088624685 +0000 UTC m=+0.105336479 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 10 10:21:52 compute-0 nova_compute[186989]: 2025-12-10 10:21:52.118 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:55 compute-0 nova_compute[186989]: 2025-12-10 10:21:55.934 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362100.9328713, e684ab58-5cc5-41f8-8460-b90ff9621838 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:21:55 compute-0 nova_compute[186989]: 2025-12-10 10:21:55.934 186993 INFO nova.compute.manager [-] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] VM Stopped (Lifecycle Event)
Dec 10 10:21:55 compute-0 nova_compute[186989]: 2025-12-10 10:21:55.955 186993 DEBUG nova.compute.manager [None req-c725cf42-4a9f-4227-b7c5-d069794933d8 - - - - - -] [instance: e684ab58-5cc5-41f8-8460-b90ff9621838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:21:56 compute-0 nova_compute[186989]: 2025-12-10 10:21:56.026 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:21:57 compute-0 nova_compute[186989]: 2025-12-10 10:21:57.174 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:00 compute-0 podman[214114]: 2025-12-10 10:22:00.061944259 +0000 UTC m=+0.097377883 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350)
Dec 10 10:22:00 compute-0 nova_compute[186989]: 2025-12-10 10:22:00.930 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362105.9290667, 70a74a19-d800-4441-ae54-2289aed3ee93 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:22:00 compute-0 nova_compute[186989]: 2025-12-10 10:22:00.931 186993 INFO nova.compute.manager [-] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] VM Stopped (Lifecycle Event)
Dec 10 10:22:00 compute-0 nova_compute[186989]: 2025-12-10 10:22:00.955 186993 DEBUG nova.compute.manager [None req-704b5c83-ebd1-4474-8834-b40d9d0334df - - - - - -] [instance: 70a74a19-d800-4441-ae54-2289aed3ee93] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.067 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.406 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "06155ade-0041-467e-92e9-2fad99467514" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.407 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.433 186993 DEBUG nova.compute.manager [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.512 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.513 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.522 186993 DEBUG nova.virt.hardware [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.522 186993 INFO nova.compute.claims [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.679 186993 DEBUG nova.compute.provider_tree [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.701 186993 DEBUG nova.scheduler.client.report [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.731 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.732 186993 DEBUG nova.compute.manager [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.772 186993 DEBUG nova.compute.manager [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.772 186993 DEBUG nova.network.neutron [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.790 186993 INFO nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.806 186993 DEBUG nova.compute.manager [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.897 186993 DEBUG nova.compute.manager [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.900 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.901 186993 INFO nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Creating image(s)
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.902 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.903 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.905 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.940 186993 DEBUG oslo_concurrency.processutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.960 186993 DEBUG nova.policy [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.998 186993 DEBUG oslo_concurrency.processutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:22:01 compute-0 nova_compute[186989]: 2025-12-10 10:22:01.999 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.000 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.025 186993 DEBUG oslo_concurrency.processutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.088 186993 DEBUG oslo_concurrency.processutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.090 186993 DEBUG oslo_concurrency.processutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.130 186993 DEBUG oslo_concurrency.processutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.132 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.132 186993 DEBUG oslo_concurrency.processutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.189 186993 DEBUG oslo_concurrency.processutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.191 186993 DEBUG nova.virt.disk.api [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.192 186993 DEBUG oslo_concurrency.processutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.208 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.246 186993 DEBUG oslo_concurrency.processutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.247 186993 DEBUG nova.virt.disk.api [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.248 186993 DEBUG nova.objects.instance [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid 06155ade-0041-467e-92e9-2fad99467514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.268 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.269 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Ensure instance console log exists: /var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.270 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.270 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:02 compute-0 nova_compute[186989]: 2025-12-10 10:22:02.270 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:03 compute-0 podman[214150]: 2025-12-10 10:22:03.026836076 +0000 UTC m=+0.069548841 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 10 10:22:04 compute-0 nova_compute[186989]: 2025-12-10 10:22:04.286 186993 DEBUG nova.network.neutron [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Successfully created port: 3a1b6a7a-e8f7-421a-af57-fb303d77f486 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:22:06 compute-0 nova_compute[186989]: 2025-12-10 10:22:06.070 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:06 compute-0 nova_compute[186989]: 2025-12-10 10:22:06.377 186993 DEBUG nova.network.neutron [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Successfully updated port: 3a1b6a7a-e8f7-421a-af57-fb303d77f486 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:22:06 compute-0 nova_compute[186989]: 2025-12-10 10:22:06.396 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:22:06 compute-0 nova_compute[186989]: 2025-12-10 10:22:06.396 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:22:06 compute-0 nova_compute[186989]: 2025-12-10 10:22:06.397 186993 DEBUG nova.network.neutron [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:22:06 compute-0 nova_compute[186989]: 2025-12-10 10:22:06.475 186993 DEBUG nova.compute.manager [req-0cf7cc88-1b7b-4ced-a439-35780b5abd0b req-00bac220-a272-4397-b0ba-da59f7f35ac8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-changed-3a1b6a7a-e8f7-421a-af57-fb303d77f486 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:06 compute-0 nova_compute[186989]: 2025-12-10 10:22:06.476 186993 DEBUG nova.compute.manager [req-0cf7cc88-1b7b-4ced-a439-35780b5abd0b req-00bac220-a272-4397-b0ba-da59f7f35ac8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Refreshing instance network info cache due to event network-changed-3a1b6a7a-e8f7-421a-af57-fb303d77f486. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:22:06 compute-0 nova_compute[186989]: 2025-12-10 10:22:06.476 186993 DEBUG oslo_concurrency.lockutils [req-0cf7cc88-1b7b-4ced-a439-35780b5abd0b req-00bac220-a272-4397-b0ba-da59f7f35ac8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:22:07 compute-0 nova_compute[186989]: 2025-12-10 10:22:07.142 186993 DEBUG nova.network.neutron [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:22:07 compute-0 nova_compute[186989]: 2025-12-10 10:22:07.208 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.343 186993 DEBUG nova.network.neutron [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updating instance_info_cache with network_info: [{"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.370 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.371 186993 DEBUG nova.compute.manager [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Instance network_info: |[{"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.372 186993 DEBUG oslo_concurrency.lockutils [req-0cf7cc88-1b7b-4ced-a439-35780b5abd0b req-00bac220-a272-4397-b0ba-da59f7f35ac8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.372 186993 DEBUG nova.network.neutron [req-0cf7cc88-1b7b-4ced-a439-35780b5abd0b req-00bac220-a272-4397-b0ba-da59f7f35ac8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Refreshing network info cache for port 3a1b6a7a-e8f7-421a-af57-fb303d77f486 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.375 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Start _get_guest_xml network_info=[{"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.381 186993 WARNING nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.386 186993 DEBUG nova.virt.libvirt.host [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.387 186993 DEBUG nova.virt.libvirt.host [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.390 186993 DEBUG nova.virt.libvirt.host [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.391 186993 DEBUG nova.virt.libvirt.host [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.391 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.392 186993 DEBUG nova.virt.hardware [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.392 186993 DEBUG nova.virt.hardware [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.393 186993 DEBUG nova.virt.hardware [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.393 186993 DEBUG nova.virt.hardware [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.393 186993 DEBUG nova.virt.hardware [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.394 186993 DEBUG nova.virt.hardware [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.394 186993 DEBUG nova.virt.hardware [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.395 186993 DEBUG nova.virt.hardware [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.395 186993 DEBUG nova.virt.hardware [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.395 186993 DEBUG nova.virt.hardware [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.396 186993 DEBUG nova.virt.hardware [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.401 186993 DEBUG nova.virt.libvirt.vif [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:22:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-553117970',display_name='tempest-TestNetworkBasicOps-server-553117970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-553117970',id=3,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM9rulyR4iCiNffHiCIPsrHfxs4xEeqz3cOIfq/A+G1OidINlr8tKtpavNRs8X9mbGWMw4RRhgy5RN/1b5AV2X87wr9L+R9c+gEFdchYhXKlmXq1eyUtEqakYSehGvCdJg==',key_name='tempest-TestNetworkBasicOps-514887772',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-o3tuxyzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:22:01Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=06155ade-0041-467e-92e9-2fad99467514,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.402 186993 DEBUG nova.network.os_vif_util [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.403 186993 DEBUG nova.network.os_vif_util [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:a7:15,bridge_name='br-int',has_traffic_filtering=True,id=3a1b6a7a-e8f7-421a-af57-fb303d77f486,network=Network(99e953a5-acb0-4f92-a7d6-2af75bab0205),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a1b6a7a-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.404 186993 DEBUG nova.objects.instance [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid 06155ade-0041-467e-92e9-2fad99467514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.420 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:22:08 compute-0 nova_compute[186989]:   <uuid>06155ade-0041-467e-92e9-2fad99467514</uuid>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   <name>instance-00000003</name>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-553117970</nova:name>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:22:08</nova:creationTime>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:22:08 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:22:08 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:22:08 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:22:08 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:22:08 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:22:08 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:22:08 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:22:08 compute-0 nova_compute[186989]:         <nova:port uuid="3a1b6a7a-e8f7-421a-af57-fb303d77f486">
Dec 10 10:22:08 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <system>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <entry name="serial">06155ade-0041-467e-92e9-2fad99467514</entry>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <entry name="uuid">06155ade-0041-467e-92e9-2fad99467514</entry>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     </system>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   <os>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   </os>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   <features>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   </features>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk.config"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:de:a7:15"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <target dev="tap3a1b6a7a-e8"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/console.log" append="off"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <video>
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     </video>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:22:08 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:22:08 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:22:08 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:22:08 compute-0 nova_compute[186989]: </domain>
Dec 10 10:22:08 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.422 186993 DEBUG nova.compute.manager [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Preparing to wait for external event network-vif-plugged-3a1b6a7a-e8f7-421a-af57-fb303d77f486 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.422 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "06155ade-0041-467e-92e9-2fad99467514-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.423 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.423 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.424 186993 DEBUG nova.virt.libvirt.vif [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:22:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-553117970',display_name='tempest-TestNetworkBasicOps-server-553117970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-553117970',id=3,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM9rulyR4iCiNffHiCIPsrHfxs4xEeqz3cOIfq/A+G1OidINlr8tKtpavNRs8X9mbGWMw4RRhgy5RN/1b5AV2X87wr9L+R9c+gEFdchYhXKlmXq1eyUtEqakYSehGvCdJg==',key_name='tempest-TestNetworkBasicOps-514887772',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-o3tuxyzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:22:01Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=06155ade-0041-467e-92e9-2fad99467514,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.424 186993 DEBUG nova.network.os_vif_util [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.425 186993 DEBUG nova.network.os_vif_util [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:a7:15,bridge_name='br-int',has_traffic_filtering=True,id=3a1b6a7a-e8f7-421a-af57-fb303d77f486,network=Network(99e953a5-acb0-4f92-a7d6-2af75bab0205),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a1b6a7a-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.425 186993 DEBUG os_vif [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:a7:15,bridge_name='br-int',has_traffic_filtering=True,id=3a1b6a7a-e8f7-421a-af57-fb303d77f486,network=Network(99e953a5-acb0-4f92-a7d6-2af75bab0205),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a1b6a7a-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.426 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.427 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.427 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.431 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.432 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a1b6a7a-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.432 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3a1b6a7a-e8, col_values=(('external_ids', {'iface-id': '3a1b6a7a-e8f7-421a-af57-fb303d77f486', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:a7:15', 'vm-uuid': '06155ade-0041-467e-92e9-2fad99467514'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:08 compute-0 NetworkManager[55541]: <info>  [1765362128.4354] manager: (tap3a1b6a7a-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.434 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.437 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.442 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.444 186993 INFO os_vif [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:a7:15,bridge_name='br-int',has_traffic_filtering=True,id=3a1b6a7a-e8f7-421a-af57-fb303d77f486,network=Network(99e953a5-acb0-4f92-a7d6-2af75bab0205),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a1b6a7a-e8')
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.490 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.490 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.490 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:de:a7:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:22:08 compute-0 nova_compute[186989]: 2025-12-10 10:22:08.491 186993 INFO nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Using config drive
Dec 10 10:22:09 compute-0 nova_compute[186989]: 2025-12-10 10:22:09.311 186993 INFO nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Creating config drive at /var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk.config
Dec 10 10:22:09 compute-0 nova_compute[186989]: 2025-12-10 10:22:09.320 186993 DEBUG oslo_concurrency.processutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnwn9akql execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:22:09 compute-0 nova_compute[186989]: 2025-12-10 10:22:09.446 186993 DEBUG oslo_concurrency.processutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnwn9akql" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:22:09 compute-0 kernel: tap3a1b6a7a-e8: entered promiscuous mode
Dec 10 10:22:09 compute-0 NetworkManager[55541]: <info>  [1765362129.5196] manager: (tap3a1b6a7a-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Dec 10 10:22:09 compute-0 ovn_controller[95452]: 2025-12-10T10:22:09Z|00049|binding|INFO|Claiming lport 3a1b6a7a-e8f7-421a-af57-fb303d77f486 for this chassis.
Dec 10 10:22:09 compute-0 ovn_controller[95452]: 2025-12-10T10:22:09Z|00050|binding|INFO|3a1b6a7a-e8f7-421a-af57-fb303d77f486: Claiming fa:16:3e:de:a7:15 10.100.0.11
Dec 10 10:22:09 compute-0 nova_compute[186989]: 2025-12-10 10:22:09.519 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:09 compute-0 nova_compute[186989]: 2025-12-10 10:22:09.526 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.533 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:a7:15 10.100.0.11'], port_security=['fa:16:3e:de:a7:15 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '06155ade-0041-467e-92e9-2fad99467514', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99e953a5-acb0-4f92-a7d6-2af75bab0205', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fc47485a-aa98-4835-9458-700243dad059', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28db7515-464c-4bb8-b217-5d6a4c627744, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=3a1b6a7a-e8f7-421a-af57-fb303d77f486) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.534 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 3a1b6a7a-e8f7-421a-af57-fb303d77f486 in datapath 99e953a5-acb0-4f92-a7d6-2af75bab0205 bound to our chassis
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.534 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 99e953a5-acb0-4f92-a7d6-2af75bab0205
Dec 10 10:22:09 compute-0 systemd-udevd[214192]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.550 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[fde7296f-51db-44f9-8ecd-f03f4b15be61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.551 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap99e953a5-a1 in ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.553 213247 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap99e953a5-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.553 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[944cb2b4-77d4-4fd2-a772-e5688601e0f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.554 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[3125b872-f616-455c-8921-de5a5b0681f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 NetworkManager[55541]: <info>  [1765362129.5618] device (tap3a1b6a7a-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:22:09 compute-0 NetworkManager[55541]: <info>  [1765362129.5624] device (tap3a1b6a7a-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.573 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[f404ffc6-b6fa-4056-b25e-da68b89edf0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 systemd-machined[153379]: New machine qemu-3-instance-00000003.
Dec 10 10:22:09 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.608 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[b5a4a37a-86a1-4982-8e4e-17ee98c7f063]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 nova_compute[186989]: 2025-12-10 10:22:09.611 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:09 compute-0 ovn_controller[95452]: 2025-12-10T10:22:09Z|00051|binding|INFO|Setting lport 3a1b6a7a-e8f7-421a-af57-fb303d77f486 ovn-installed in OVS
Dec 10 10:22:09 compute-0 ovn_controller[95452]: 2025-12-10T10:22:09Z|00052|binding|INFO|Setting lport 3a1b6a7a-e8f7-421a-af57-fb303d77f486 up in Southbound
Dec 10 10:22:09 compute-0 nova_compute[186989]: 2025-12-10 10:22:09.617 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.638 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[c949a38b-0f81-4695-a9cd-0d01c6a26496]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 NetworkManager[55541]: <info>  [1765362129.6448] manager: (tap99e953a5-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/34)
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.644 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[df3dd4b1-0d91-49a4-a844-bb56c3111650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.674 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[5460f6ca-2ea1-4a0d-ab39-88055934c09e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.679 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe0c39a-7a7d-460d-840d-1d040254951b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 NetworkManager[55541]: <info>  [1765362129.7008] device (tap99e953a5-a0): carrier: link connected
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.706 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[b431e4a8-49c7-4d9d-898c-bf443fb0efa0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.724 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[43f3896f-872d-46f3-b521-50abd7b09a5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99e953a5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:8d:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 314491, 'reachable_time': 16670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214227, 'error': None, 'target': 'ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.745 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[3339f0fc-821d-4dbb-aec0-6df88c7a9cf4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:8d0f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 314491, 'tstamp': 314491}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214228, 'error': None, 'target': 'ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.767 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[749f2185-e89d-4dad-a348-1282a5fd384f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99e953a5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:8d:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 314491, 'reachable_time': 16670, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214229, 'error': None, 'target': 'ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.809 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab9aaf1-3564-421c-b423-93566bd8f3a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.873 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[deb41683-a58f-409d-b0c0-6cf5e13f3f31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.875 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99e953a5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.875 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.876 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99e953a5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:09 compute-0 nova_compute[186989]: 2025-12-10 10:22:09.877 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:09 compute-0 NetworkManager[55541]: <info>  [1765362129.8787] manager: (tap99e953a5-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Dec 10 10:22:09 compute-0 kernel: tap99e953a5-a0: entered promiscuous mode
Dec 10 10:22:09 compute-0 nova_compute[186989]: 2025-12-10 10:22:09.880 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.884 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap99e953a5-a0, col_values=(('external_ids', {'iface-id': '7972de09-2cb7-4533-92bc-c9d1cd414652'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:09 compute-0 nova_compute[186989]: 2025-12-10 10:22:09.886 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:09 compute-0 ovn_controller[95452]: 2025-12-10T10:22:09Z|00053|binding|INFO|Releasing lport 7972de09-2cb7-4533-92bc-c9d1cd414652 from this chassis (sb_readonly=0)
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.889 104302 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/99e953a5-acb0-4f92-a7d6-2af75bab0205.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/99e953a5-acb0-4f92-a7d6-2af75bab0205.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.890 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c2ce0f-1b09-4538-8f0c-97121d8c955b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.891 104302 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: global
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     log         /dev/log local0 debug
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     log-tag     haproxy-metadata-proxy-99e953a5-acb0-4f92-a7d6-2af75bab0205
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     user        root
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     group       root
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     maxconn     1024
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     pidfile     /var/lib/neutron/external/pids/99e953a5-acb0-4f92-a7d6-2af75bab0205.pid.haproxy
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     daemon
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: defaults
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     log global
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     mode http
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     option httplog
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     option dontlognull
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     option http-server-close
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     option forwardfor
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     retries                 3
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     timeout http-request    30s
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     timeout connect         30s
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     timeout client          32s
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     timeout server          32s
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     timeout http-keep-alive 30s
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: listen listener
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     bind 169.254.169.254:80
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     server metadata /var/lib/neutron/metadata_proxy
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:     http-request add-header X-OVN-Network-ID 99e953a5-acb0-4f92-a7d6-2af75bab0205
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 10 10:22:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:09.891 104302 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205', 'env', 'PROCESS_TAG=haproxy-99e953a5-acb0-4f92-a7d6-2af75bab0205', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/99e953a5-acb0-4f92-a7d6-2af75bab0205.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 10 10:22:09 compute-0 nova_compute[186989]: 2025-12-10 10:22:09.901 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.040 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362130.0393128, 06155ade-0041-467e-92e9-2fad99467514 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.040 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 06155ade-0041-467e-92e9-2fad99467514] VM Started (Lifecycle Event)
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.063 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 06155ade-0041-467e-92e9-2fad99467514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.066 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362130.03954, 06155ade-0041-467e-92e9-2fad99467514 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.066 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 06155ade-0041-467e-92e9-2fad99467514] VM Paused (Lifecycle Event)
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.083 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 06155ade-0041-467e-92e9-2fad99467514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.087 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 06155ade-0041-467e-92e9-2fad99467514] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.120 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 06155ade-0041-467e-92e9-2fad99467514] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.296 186993 DEBUG nova.compute.manager [req-4d5d247e-1350-4e7c-8853-aa14b618fe2c req-0171c702-62ac-4507-96a1-2402c903e5a8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-vif-plugged-3a1b6a7a-e8f7-421a-af57-fb303d77f486 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.298 186993 DEBUG oslo_concurrency.lockutils [req-4d5d247e-1350-4e7c-8853-aa14b618fe2c req-0171c702-62ac-4507-96a1-2402c903e5a8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "06155ade-0041-467e-92e9-2fad99467514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.298 186993 DEBUG oslo_concurrency.lockutils [req-4d5d247e-1350-4e7c-8853-aa14b618fe2c req-0171c702-62ac-4507-96a1-2402c903e5a8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.299 186993 DEBUG oslo_concurrency.lockutils [req-4d5d247e-1350-4e7c-8853-aa14b618fe2c req-0171c702-62ac-4507-96a1-2402c903e5a8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.299 186993 DEBUG nova.compute.manager [req-4d5d247e-1350-4e7c-8853-aa14b618fe2c req-0171c702-62ac-4507-96a1-2402c903e5a8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Processing event network-vif-plugged-3a1b6a7a-e8f7-421a-af57-fb303d77f486 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.300 186993 DEBUG nova.compute.manager [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:22:10 compute-0 podman[214268]: 2025-12-10 10:22:10.303266894 +0000 UTC m=+0.055918218 container create 57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.305 186993 DEBUG nova.network.neutron [req-0cf7cc88-1b7b-4ced-a439-35780b5abd0b req-00bac220-a272-4397-b0ba-da59f7f35ac8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updated VIF entry in instance network info cache for port 3a1b6a7a-e8f7-421a-af57-fb303d77f486. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.306 186993 DEBUG nova.network.neutron [req-0cf7cc88-1b7b-4ced-a439-35780b5abd0b req-00bac220-a272-4397-b0ba-da59f7f35ac8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updating instance_info_cache with network_info: [{"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.308 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362130.3055408, 06155ade-0041-467e-92e9-2fad99467514 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.309 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 06155ade-0041-467e-92e9-2fad99467514] VM Resumed (Lifecycle Event)
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.312 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.317 186993 INFO nova.virt.libvirt.driver [-] [instance: 06155ade-0041-467e-92e9-2fad99467514] Instance spawned successfully.
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.319 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.322 186993 DEBUG oslo_concurrency.lockutils [req-0cf7cc88-1b7b-4ced-a439-35780b5abd0b req-00bac220-a272-4397-b0ba-da59f7f35ac8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:22:10 compute-0 systemd[1]: Started libpod-conmon-57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6.scope.
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.341 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 06155ade-0041-467e-92e9-2fad99467514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.348 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 06155ade-0041-467e-92e9-2fad99467514] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.351 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.351 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.352 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.352 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.353 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.353 186993 DEBUG nova.virt.libvirt.driver [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:22:10 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:22:10 compute-0 podman[214268]: 2025-12-10 10:22:10.273406838 +0000 UTC m=+0.026058202 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:22:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac9b9c0d854219fc97ccc4a64ed907868feb88db9b799730b8aeebb446e1008c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.388 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 06155ade-0041-467e-92e9-2fad99467514] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:22:10 compute-0 podman[214268]: 2025-12-10 10:22:10.391696671 +0000 UTC m=+0.144348035 container init 57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 10 10:22:10 compute-0 podman[214268]: 2025-12-10 10:22:10.397079217 +0000 UTC m=+0.149730581 container start 57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Dec 10 10:22:10 compute-0 neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205[214284]: [NOTICE]   (214288) : New worker (214290) forked
Dec 10 10:22:10 compute-0 neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205[214284]: [NOTICE]   (214288) : Loading success.
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.429 186993 INFO nova.compute.manager [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Took 8.53 seconds to spawn the instance on the hypervisor.
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.430 186993 DEBUG nova.compute.manager [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.539 186993 INFO nova.compute.manager [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Took 9.06 seconds to build instance.
Dec 10 10:22:10 compute-0 nova_compute[186989]: 2025-12-10 10:22:10.559 186993 DEBUG oslo_concurrency.lockutils [None req-cb3674bb-0754-45d1-b461-c53266598825 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:12 compute-0 nova_compute[186989]: 2025-12-10 10:22:12.210 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:12 compute-0 nova_compute[186989]: 2025-12-10 10:22:12.358 186993 DEBUG nova.compute.manager [req-9ec60249-85c9-402d-b7ae-174921e18edb req-e9c4db5d-429c-42eb-8a46-7f037e33b9e2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-vif-plugged-3a1b6a7a-e8f7-421a-af57-fb303d77f486 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:12 compute-0 nova_compute[186989]: 2025-12-10 10:22:12.358 186993 DEBUG oslo_concurrency.lockutils [req-9ec60249-85c9-402d-b7ae-174921e18edb req-e9c4db5d-429c-42eb-8a46-7f037e33b9e2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "06155ade-0041-467e-92e9-2fad99467514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:12 compute-0 nova_compute[186989]: 2025-12-10 10:22:12.359 186993 DEBUG oslo_concurrency.lockutils [req-9ec60249-85c9-402d-b7ae-174921e18edb req-e9c4db5d-429c-42eb-8a46-7f037e33b9e2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:12 compute-0 nova_compute[186989]: 2025-12-10 10:22:12.359 186993 DEBUG oslo_concurrency.lockutils [req-9ec60249-85c9-402d-b7ae-174921e18edb req-e9c4db5d-429c-42eb-8a46-7f037e33b9e2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:12 compute-0 nova_compute[186989]: 2025-12-10 10:22:12.359 186993 DEBUG nova.compute.manager [req-9ec60249-85c9-402d-b7ae-174921e18edb req-e9c4db5d-429c-42eb-8a46-7f037e33b9e2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] No waiting events found dispatching network-vif-plugged-3a1b6a7a-e8f7-421a-af57-fb303d77f486 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:22:12 compute-0 nova_compute[186989]: 2025-12-10 10:22:12.359 186993 WARNING nova.compute.manager [req-9ec60249-85c9-402d-b7ae-174921e18edb req-e9c4db5d-429c-42eb-8a46-7f037e33b9e2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received unexpected event network-vif-plugged-3a1b6a7a-e8f7-421a-af57-fb303d77f486 for instance with vm_state active and task_state None.
Dec 10 10:22:13 compute-0 nova_compute[186989]: 2025-12-10 10:22:13.436 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:14 compute-0 podman[214299]: 2025-12-10 10:22:14.022895634 +0000 UTC m=+0.067116295 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:22:16 compute-0 podman[214325]: 2025-12-10 10:22:16.015104313 +0000 UTC m=+0.057589045 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 10 10:22:16 compute-0 NetworkManager[55541]: <info>  [1765362136.1350] manager: (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Dec 10 10:22:16 compute-0 NetworkManager[55541]: <info>  [1765362136.1358] manager: (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Dec 10 10:22:16 compute-0 nova_compute[186989]: 2025-12-10 10:22:16.134 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:16 compute-0 ovn_controller[95452]: 2025-12-10T10:22:16Z|00054|binding|INFO|Releasing lport 7972de09-2cb7-4533-92bc-c9d1cd414652 from this chassis (sb_readonly=0)
Dec 10 10:22:16 compute-0 ovn_controller[95452]: 2025-12-10T10:22:16Z|00055|binding|INFO|Releasing lport 7972de09-2cb7-4533-92bc-c9d1cd414652 from this chassis (sb_readonly=0)
Dec 10 10:22:16 compute-0 nova_compute[186989]: 2025-12-10 10:22:16.165 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:16 compute-0 nova_compute[186989]: 2025-12-10 10:22:16.170 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:16 compute-0 nova_compute[186989]: 2025-12-10 10:22:16.512 186993 DEBUG nova.compute.manager [req-7a55fbbd-5e36-4cb1-8548-da3c239139a9 req-9c0b5caf-6556-49a2-9f9e-86d700853492 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-changed-3a1b6a7a-e8f7-421a-af57-fb303d77f486 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:16 compute-0 nova_compute[186989]: 2025-12-10 10:22:16.513 186993 DEBUG nova.compute.manager [req-7a55fbbd-5e36-4cb1-8548-da3c239139a9 req-9c0b5caf-6556-49a2-9f9e-86d700853492 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Refreshing instance network info cache due to event network-changed-3a1b6a7a-e8f7-421a-af57-fb303d77f486. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:22:16 compute-0 nova_compute[186989]: 2025-12-10 10:22:16.513 186993 DEBUG oslo_concurrency.lockutils [req-7a55fbbd-5e36-4cb1-8548-da3c239139a9 req-9c0b5caf-6556-49a2-9f9e-86d700853492 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:22:16 compute-0 nova_compute[186989]: 2025-12-10 10:22:16.514 186993 DEBUG oslo_concurrency.lockutils [req-7a55fbbd-5e36-4cb1-8548-da3c239139a9 req-9c0b5caf-6556-49a2-9f9e-86d700853492 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:22:16 compute-0 nova_compute[186989]: 2025-12-10 10:22:16.514 186993 DEBUG nova.network.neutron [req-7a55fbbd-5e36-4cb1-8548-da3c239139a9 req-9c0b5caf-6556-49a2-9f9e-86d700853492 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Refreshing network info cache for port 3a1b6a7a-e8f7-421a-af57-fb303d77f486 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:22:17 compute-0 nova_compute[186989]: 2025-12-10 10:22:17.212 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:17 compute-0 nova_compute[186989]: 2025-12-10 10:22:17.627 186993 DEBUG nova.network.neutron [req-7a55fbbd-5e36-4cb1-8548-da3c239139a9 req-9c0b5caf-6556-49a2-9f9e-86d700853492 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updated VIF entry in instance network info cache for port 3a1b6a7a-e8f7-421a-af57-fb303d77f486. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:22:17 compute-0 nova_compute[186989]: 2025-12-10 10:22:17.628 186993 DEBUG nova.network.neutron [req-7a55fbbd-5e36-4cb1-8548-da3c239139a9 req-9c0b5caf-6556-49a2-9f9e-86d700853492 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updating instance_info_cache with network_info: [{"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:22:17 compute-0 nova_compute[186989]: 2025-12-10 10:22:17.648 186993 DEBUG oslo_concurrency.lockutils [req-7a55fbbd-5e36-4cb1-8548-da3c239139a9 req-9c0b5caf-6556-49a2-9f9e-86d700853492 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:22:18 compute-0 nova_compute[186989]: 2025-12-10 10:22:18.438 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:22 compute-0 nova_compute[186989]: 2025-12-10 10:22:22.214 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:22 compute-0 podman[214367]: 2025-12-10 10:22:22.31575475 +0000 UTC m=+0.064844223 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 10 10:22:22 compute-0 podman[214366]: 2025-12-10 10:22:22.320208311 +0000 UTC m=+0.073942581 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:22:22 compute-0 podman[214368]: 2025-12-10 10:22:22.378920286 +0000 UTC m=+0.125373698 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 10 10:22:23 compute-0 ovn_controller[95452]: 2025-12-10T10:22:23Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:de:a7:15 10.100.0.11
Dec 10 10:22:23 compute-0 ovn_controller[95452]: 2025-12-10T10:22:23Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:de:a7:15 10.100.0.11
Dec 10 10:22:23 compute-0 nova_compute[186989]: 2025-12-10 10:22:23.443 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:27 compute-0 nova_compute[186989]: 2025-12-10 10:22:27.215 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:28 compute-0 nova_compute[186989]: 2025-12-10 10:22:28.446 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:28 compute-0 nova_compute[186989]: 2025-12-10 10:22:28.872 186993 INFO nova.compute.manager [None req-df91027a-f550-4fa0-a76f-13a70f64c88b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Get console output
Dec 10 10:22:28 compute-0 nova_compute[186989]: 2025-12-10 10:22:28.881 213152 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 10 10:22:29 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:29.271 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:d5:69', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:b1:dd:ed:fa:0b'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:22:29 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:29.274 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 10 10:22:29 compute-0 nova_compute[186989]: 2025-12-10 10:22:29.335 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:31 compute-0 podman[214432]: 2025-12-10 10:22:31.042893161 +0000 UTC m=+0.078338552 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal)
Dec 10 10:22:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:31.463 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:31.463 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:31.464 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:32 compute-0 nova_compute[186989]: 2025-12-10 10:22:32.217 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:32 compute-0 nova_compute[186989]: 2025-12-10 10:22:32.572 186993 DEBUG oslo_concurrency.lockutils [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "interface-06155ade-0041-467e-92e9-2fad99467514-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:32 compute-0 nova_compute[186989]: 2025-12-10 10:22:32.574 186993 DEBUG oslo_concurrency.lockutils [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "interface-06155ade-0041-467e-92e9-2fad99467514-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:32 compute-0 nova_compute[186989]: 2025-12-10 10:22:32.575 186993 DEBUG nova.objects.instance [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'flavor' on Instance uuid 06155ade-0041-467e-92e9-2fad99467514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:22:33 compute-0 nova_compute[186989]: 2025-12-10 10:22:33.259 186993 DEBUG nova.objects.instance [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_requests' on Instance uuid 06155ade-0041-467e-92e9-2fad99467514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:22:33 compute-0 nova_compute[186989]: 2025-12-10 10:22:33.273 186993 DEBUG nova.network.neutron [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:22:33 compute-0 nova_compute[186989]: 2025-12-10 10:22:33.398 186993 DEBUG nova.policy [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:22:33 compute-0 nova_compute[186989]: 2025-12-10 10:22:33.448 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:34 compute-0 podman[214455]: 2025-12-10 10:22:34.017777977 +0000 UTC m=+0.056688072 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:22:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:34.278 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:34 compute-0 nova_compute[186989]: 2025-12-10 10:22:34.327 186993 DEBUG nova.network.neutron [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Successfully created port: 06e93e85-4a32-4594-aa62-14281107bca2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:22:35 compute-0 nova_compute[186989]: 2025-12-10 10:22:35.428 186993 DEBUG nova.network.neutron [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Successfully updated port: 06e93e85-4a32-4594-aa62-14281107bca2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:22:35 compute-0 nova_compute[186989]: 2025-12-10 10:22:35.592 186993 DEBUG oslo_concurrency.lockutils [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:22:35 compute-0 nova_compute[186989]: 2025-12-10 10:22:35.593 186993 DEBUG oslo_concurrency.lockutils [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:22:35 compute-0 nova_compute[186989]: 2025-12-10 10:22:35.593 186993 DEBUG nova.network.neutron [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:22:35 compute-0 nova_compute[186989]: 2025-12-10 10:22:35.828 186993 DEBUG nova.compute.manager [req-436f681a-24c1-432e-a5ce-6ac6db0fb175 req-69afde88-91cc-4007-8dae-243bf754deb2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-changed-06e93e85-4a32-4594-aa62-14281107bca2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:35 compute-0 nova_compute[186989]: 2025-12-10 10:22:35.830 186993 DEBUG nova.compute.manager [req-436f681a-24c1-432e-a5ce-6ac6db0fb175 req-69afde88-91cc-4007-8dae-243bf754deb2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Refreshing instance network info cache due to event network-changed-06e93e85-4a32-4594-aa62-14281107bca2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:22:35 compute-0 nova_compute[186989]: 2025-12-10 10:22:35.833 186993 DEBUG oslo_concurrency.lockutils [req-436f681a-24c1-432e-a5ce-6ac6db0fb175 req-69afde88-91cc-4007-8dae-243bf754deb2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.221 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.492 186993 DEBUG nova.network.neutron [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updating instance_info_cache with network_info: [{"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.517 186993 DEBUG oslo_concurrency.lockutils [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.518 186993 DEBUG oslo_concurrency.lockutils [req-436f681a-24c1-432e-a5ce-6ac6db0fb175 req-69afde88-91cc-4007-8dae-243bf754deb2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.519 186993 DEBUG nova.network.neutron [req-436f681a-24c1-432e-a5ce-6ac6db0fb175 req-69afde88-91cc-4007-8dae-243bf754deb2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Refreshing network info cache for port 06e93e85-4a32-4594-aa62-14281107bca2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.525 186993 DEBUG nova.virt.libvirt.vif [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:22:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-553117970',display_name='tempest-TestNetworkBasicOps-server-553117970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-553117970',id=3,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM9rulyR4iCiNffHiCIPsrHfxs4xEeqz3cOIfq/A+G1OidINlr8tKtpavNRs8X9mbGWMw4RRhgy5RN/1b5AV2X87wr9L+R9c+gEFdchYhXKlmXq1eyUtEqakYSehGvCdJg==',key_name='tempest-TestNetworkBasicOps-514887772',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:22:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-o3tuxyzd',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:22:10Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=06155ade-0041-467e-92e9-2fad99467514,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.525 186993 DEBUG nova.network.os_vif_util [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.527 186993 DEBUG nova.network.os_vif_util [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:e8:c1,bridge_name='br-int',has_traffic_filtering=True,id=06e93e85-4a32-4594-aa62-14281107bca2,network=Network(ae2c3369-0ad9-4308-8c8e-76562a3e5352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e93e85-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.527 186993 DEBUG os_vif [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:e8:c1,bridge_name='br-int',has_traffic_filtering=True,id=06e93e85-4a32-4594-aa62-14281107bca2,network=Network(ae2c3369-0ad9-4308-8c8e-76562a3e5352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e93e85-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.528 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.529 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.530 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.538 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.538 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06e93e85-4a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.539 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06e93e85-4a, col_values=(('external_ids', {'iface-id': '06e93e85-4a32-4594-aa62-14281107bca2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:e8:c1', 'vm-uuid': '06155ade-0041-467e-92e9-2fad99467514'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.541 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 NetworkManager[55541]: <info>  [1765362157.5426] manager: (tap06e93e85-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.543 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.549 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.550 186993 INFO os_vif [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:e8:c1,bridge_name='br-int',has_traffic_filtering=True,id=06e93e85-4a32-4594-aa62-14281107bca2,network=Network(ae2c3369-0ad9-4308-8c8e-76562a3e5352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e93e85-4a')
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.551 186993 DEBUG nova.virt.libvirt.vif [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:22:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-553117970',display_name='tempest-TestNetworkBasicOps-server-553117970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-553117970',id=3,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM9rulyR4iCiNffHiCIPsrHfxs4xEeqz3cOIfq/A+G1OidINlr8tKtpavNRs8X9mbGWMw4RRhgy5RN/1b5AV2X87wr9L+R9c+gEFdchYhXKlmXq1eyUtEqakYSehGvCdJg==',key_name='tempest-TestNetworkBasicOps-514887772',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:22:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-o3tuxyzd',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:22:10Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=06155ade-0041-467e-92e9-2fad99467514,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.552 186993 DEBUG nova.network.os_vif_util [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.552 186993 DEBUG nova.network.os_vif_util [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:e8:c1,bridge_name='br-int',has_traffic_filtering=True,id=06e93e85-4a32-4594-aa62-14281107bca2,network=Network(ae2c3369-0ad9-4308-8c8e-76562a3e5352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e93e85-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.556 186993 DEBUG nova.virt.libvirt.guest [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] attach device xml: <interface type="ethernet">
Dec 10 10:22:37 compute-0 nova_compute[186989]:   <mac address="fa:16:3e:d1:e8:c1"/>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   <model type="virtio"/>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   <mtu size="1442"/>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   <target dev="tap06e93e85-4a"/>
Dec 10 10:22:37 compute-0 nova_compute[186989]: </interface>
Dec 10 10:22:37 compute-0 nova_compute[186989]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Dec 10 10:22:37 compute-0 kernel: tap06e93e85-4a: entered promiscuous mode
Dec 10 10:22:37 compute-0 NetworkManager[55541]: <info>  [1765362157.5748] manager: (tap06e93e85-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Dec 10 10:22:37 compute-0 ovn_controller[95452]: 2025-12-10T10:22:37Z|00056|binding|INFO|Claiming lport 06e93e85-4a32-4594-aa62-14281107bca2 for this chassis.
Dec 10 10:22:37 compute-0 ovn_controller[95452]: 2025-12-10T10:22:37Z|00057|binding|INFO|06e93e85-4a32-4594-aa62-14281107bca2: Claiming fa:16:3e:d1:e8:c1 10.100.0.21
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.578 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.581 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.589 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:e8:c1 10.100.0.21'], port_security=['fa:16:3e:d1:e8:c1 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': '06155ade-0041-467e-92e9-2fad99467514', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae2c3369-0ad9-4308-8c8e-76562a3e5352', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': '796e6156-6d8e-4cf4-b04a-830fa4553503', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b562f2f3-41da-4fcb-931f-ed0d05ebb198, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=06e93e85-4a32-4594-aa62-14281107bca2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.591 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 06e93e85-4a32-4594-aa62-14281107bca2 in datapath ae2c3369-0ad9-4308-8c8e-76562a3e5352 bound to our chassis
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.594 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ae2c3369-0ad9-4308-8c8e-76562a3e5352
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.608 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[bb1563ea-590b-4022-a5a8-85c1359a0301]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.609 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapae2c3369-01 in ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.612 213247 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapae2c3369-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.613 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[7210e6f5-ba45-49df-9956-175686e7b8b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.613 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.614 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[6efac4de-48f8-411f-a7f6-bcc77efd6ce2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 ovn_controller[95452]: 2025-12-10T10:22:37Z|00058|binding|INFO|Setting lport 06e93e85-4a32-4594-aa62-14281107bca2 ovn-installed in OVS
Dec 10 10:22:37 compute-0 ovn_controller[95452]: 2025-12-10T10:22:37Z|00059|binding|INFO|Setting lport 06e93e85-4a32-4594-aa62-14281107bca2 up in Southbound
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.618 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 systemd-udevd[214488]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.629 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a66705-4564-4c93-a23d-11ff05306ffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 NetworkManager[55541]: <info>  [1765362157.6423] device (tap06e93e85-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:22:37 compute-0 NetworkManager[55541]: <info>  [1765362157.6438] device (tap06e93e85-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.644 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[afb1aa6b-9ae0-4641-9c5f-4f92c5cec933]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.678 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[1d3d9ad5-00e3-41de-9629-10fb4bab916a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.683 186993 DEBUG nova.virt.libvirt.driver [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.684 186993 DEBUG nova.virt.libvirt.driver [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.684 186993 DEBUG nova.virt.libvirt.driver [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:de:a7:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.684 186993 DEBUG nova.virt.libvirt.driver [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:d1:e8:c1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.685 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[fbbf57fd-d1f9-4565-b2ac-653500692a5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 NetworkManager[55541]: <info>  [1765362157.6868] manager: (tapae2c3369-00): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.710 186993 DEBUG nova.virt.libvirt.guest [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:22:37 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-553117970</nova:name>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:22:37</nova:creationTime>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:22:37 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:22:37 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:22:37 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:22:37 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:22:37 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:22:37 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:22:37 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:22:37 compute-0 nova_compute[186989]:     <nova:port uuid="3a1b6a7a-e8f7-421a-af57-fb303d77f486">
Dec 10 10:22:37 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 10 10:22:37 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:22:37 compute-0 nova_compute[186989]:     <nova:port uuid="06e93e85-4a32-4594-aa62-14281107bca2">
Dec 10 10:22:37 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Dec 10 10:22:37 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:22:37 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:22:37 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:22:37 compute-0 nova_compute[186989]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.722 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[08445964-a33d-4855-8693-9f6697c48773]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.726 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[1effc3c0-585b-45c8-8cdf-715abad0ea29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.732 186993 DEBUG oslo_concurrency.lockutils [None req-a6f273ef-5349-414f-ad32-3e12c2ededba 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "interface-06155ade-0041-467e-92e9-2fad99467514-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:37 compute-0 NetworkManager[55541]: <info>  [1765362157.7532] device (tapae2c3369-00): carrier: link connected
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.758 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[80fbb988-8519-4f16-b350-7b42aa505614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.774 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[450d195a-901b-4b11-9c6a-796474fad663]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae2c3369-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:7e:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 317297, 'reachable_time': 38985, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214513, 'error': None, 'target': 'ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.792 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[0fbce37f-68e4-43ea-89d2-07c7a47f2139]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:7e8e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 317297, 'tstamp': 317297}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214514, 'error': None, 'target': 'ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.812 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d73359-eb7f-4fca-a060-bb8e5056a6ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapae2c3369-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:7e:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 317297, 'reachable_time': 38985, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214515, 'error': None, 'target': 'ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.848 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[e1968445-ef29-4239-9159-5710c33db8e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.920 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[49d5fc74-8f0a-4f46-89b8-c654c47005d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.921 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae2c3369-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.922 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.922 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae2c3369-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.924 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 NetworkManager[55541]: <info>  [1765362157.9254] manager: (tapae2c3369-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Dec 10 10:22:37 compute-0 kernel: tapae2c3369-00: entered promiscuous mode
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.926 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.927 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapae2c3369-00, col_values=(('external_ids', {'iface-id': '1264f9e8-7484-4ed6-958e-5fb392cff35a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.928 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 ovn_controller[95452]: 2025-12-10T10:22:37Z|00060|binding|INFO|Releasing lport 1264f9e8-7484-4ed6-958e-5fb392cff35a from this chassis (sb_readonly=0)
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.934 186993 DEBUG nova.compute.manager [req-ef8a78af-65bd-4216-85fb-0ee6f31f4223 req-880eb2c7-d697-4ea9-9b85-0687b756d0f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-vif-plugged-06e93e85-4a32-4594-aa62-14281107bca2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.935 186993 DEBUG oslo_concurrency.lockutils [req-ef8a78af-65bd-4216-85fb-0ee6f31f4223 req-880eb2c7-d697-4ea9-9b85-0687b756d0f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "06155ade-0041-467e-92e9-2fad99467514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.935 186993 DEBUG oslo_concurrency.lockutils [req-ef8a78af-65bd-4216-85fb-0ee6f31f4223 req-880eb2c7-d697-4ea9-9b85-0687b756d0f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.935 186993 DEBUG oslo_concurrency.lockutils [req-ef8a78af-65bd-4216-85fb-0ee6f31f4223 req-880eb2c7-d697-4ea9-9b85-0687b756d0f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.936 186993 DEBUG nova.compute.manager [req-ef8a78af-65bd-4216-85fb-0ee6f31f4223 req-880eb2c7-d697-4ea9-9b85-0687b756d0f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] No waiting events found dispatching network-vif-plugged-06e93e85-4a32-4594-aa62-14281107bca2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.936 186993 WARNING nova.compute.manager [req-ef8a78af-65bd-4216-85fb-0ee6f31f4223 req-880eb2c7-d697-4ea9-9b85-0687b756d0f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received unexpected event network-vif-plugged-06e93e85-4a32-4594-aa62-14281107bca2 for instance with vm_state active and task_state None.
Dec 10 10:22:37 compute-0 nova_compute[186989]: 2025-12-10 10:22:37.940 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.942 104302 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ae2c3369-0ad9-4308-8c8e-76562a3e5352.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ae2c3369-0ad9-4308-8c8e-76562a3e5352.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.943 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[0e17e236-964e-48d0-bd3c-0ac5bccae5a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.944 104302 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: global
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     log         /dev/log local0 debug
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     log-tag     haproxy-metadata-proxy-ae2c3369-0ad9-4308-8c8e-76562a3e5352
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     user        root
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     group       root
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     maxconn     1024
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     pidfile     /var/lib/neutron/external/pids/ae2c3369-0ad9-4308-8c8e-76562a3e5352.pid.haproxy
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     daemon
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: defaults
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     log global
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     mode http
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     option httplog
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     option dontlognull
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     option http-server-close
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     option forwardfor
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     retries                 3
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     timeout http-request    30s
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     timeout connect         30s
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     timeout client          32s
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     timeout server          32s
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     timeout http-keep-alive 30s
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: listen listener
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     bind 169.254.169.254:80
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     server metadata /var/lib/neutron/metadata_proxy
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:     http-request add-header X-OVN-Network-ID ae2c3369-0ad9-4308-8c8e-76562a3e5352
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 10 10:22:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:37.945 104302 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352', 'env', 'PROCESS_TAG=haproxy-ae2c3369-0ad9-4308-8c8e-76562a3e5352', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ae2c3369-0ad9-4308-8c8e-76562a3e5352.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 10 10:22:38 compute-0 podman[214546]: 2025-12-10 10:22:38.304307368 +0000 UTC m=+0.022817935 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:22:38 compute-0 nova_compute[186989]: 2025-12-10 10:22:38.622 186993 DEBUG nova.network.neutron [req-436f681a-24c1-432e-a5ce-6ac6db0fb175 req-69afde88-91cc-4007-8dae-243bf754deb2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updated VIF entry in instance network info cache for port 06e93e85-4a32-4594-aa62-14281107bca2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:22:38 compute-0 nova_compute[186989]: 2025-12-10 10:22:38.623 186993 DEBUG nova.network.neutron [req-436f681a-24c1-432e-a5ce-6ac6db0fb175 req-69afde88-91cc-4007-8dae-243bf754deb2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updating instance_info_cache with network_info: [{"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:22:38 compute-0 nova_compute[186989]: 2025-12-10 10:22:38.646 186993 DEBUG oslo_concurrency.lockutils [req-436f681a-24c1-432e-a5ce-6ac6db0fb175 req-69afde88-91cc-4007-8dae-243bf754deb2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.040 186993 DEBUG oslo_concurrency.lockutils [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "interface-06155ade-0041-467e-92e9-2fad99467514-06e93e85-4a32-4594-aa62-14281107bca2" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.040 186993 DEBUG oslo_concurrency.lockutils [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "interface-06155ade-0041-467e-92e9-2fad99467514-06e93e85-4a32-4594-aa62-14281107bca2" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.057 186993 DEBUG nova.objects.instance [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'flavor' on Instance uuid 06155ade-0041-467e-92e9-2fad99467514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.083 186993 DEBUG nova.virt.libvirt.vif [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:22:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-553117970',display_name='tempest-TestNetworkBasicOps-server-553117970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-553117970',id=3,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM9rulyR4iCiNffHiCIPsrHfxs4xEeqz3cOIfq/A+G1OidINlr8tKtpavNRs8X9mbGWMw4RRhgy5RN/1b5AV2X87wr9L+R9c+gEFdchYhXKlmXq1eyUtEqakYSehGvCdJg==',key_name='tempest-TestNetworkBasicOps-514887772',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:22:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-o3tuxyzd',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:22:10Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=06155ade-0041-467e-92e9-2fad99467514,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.084 186993 DEBUG nova.network.os_vif_util [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.085 186993 DEBUG nova.network.os_vif_util [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:e8:c1,bridge_name='br-int',has_traffic_filtering=True,id=06e93e85-4a32-4594-aa62-14281107bca2,network=Network(ae2c3369-0ad9-4308-8c8e-76562a3e5352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e93e85-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.091 186993 DEBUG nova.virt.libvirt.guest [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:d1:e8:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap06e93e85-4a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.095 186993 DEBUG nova.virt.libvirt.guest [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:d1:e8:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap06e93e85-4a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.099 186993 DEBUG nova.virt.libvirt.driver [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Attempting to detach device tap06e93e85-4a from instance 06155ade-0041-467e-92e9-2fad99467514 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.100 186993 DEBUG nova.virt.libvirt.guest [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] detach device xml: <interface type="ethernet">
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <mac address="fa:16:3e:d1:e8:c1"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <model type="virtio"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <mtu size="1442"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <target dev="tap06e93e85-4a"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]: </interface>
Dec 10 10:22:39 compute-0 nova_compute[186989]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 10 10:22:39 compute-0 podman[214546]: 2025-12-10 10:22:39.461392489 +0000 UTC m=+1.179903066 container create 3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.465 186993 DEBUG nova.virt.libvirt.guest [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:d1:e8:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap06e93e85-4a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 10 10:22:39 compute-0 ovn_controller[95452]: 2025-12-10T10:22:39Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d1:e8:c1 10.100.0.21
Dec 10 10:22:39 compute-0 ovn_controller[95452]: 2025-12-10T10:22:39Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d1:e8:c1 10.100.0.21
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.472 186993 DEBUG nova.virt.libvirt.guest [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:d1:e8:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap06e93e85-4a"/></interface>not found in domain: <domain type='kvm' id='3'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <name>instance-00000003</name>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <uuid>06155ade-0041-467e-92e9-2fad99467514</uuid>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-553117970</nova:name>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:22:37</nova:creationTime>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:port uuid="3a1b6a7a-e8f7-421a-af57-fb303d77f486">
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:port uuid="06e93e85-4a32-4594-aa62-14281107bca2">
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:22:39 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <memory unit='KiB'>131072</memory>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <vcpu placement='static'>1</vcpu>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <resource>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <partition>/machine</partition>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </resource>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <sysinfo type='smbios'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <system>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <entry name='manufacturer'>RDO</entry>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <entry name='product'>OpenStack Compute</entry>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <entry name='serial'>06155ade-0041-467e-92e9-2fad99467514</entry>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <entry name='uuid'>06155ade-0041-467e-92e9-2fad99467514</entry>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <entry name='family'>Virtual Machine</entry>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </system>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <os>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <boot dev='hd'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <smbios mode='sysinfo'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </os>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <features>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <vmcoreinfo state='on'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </features>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <cpu mode='custom' match='exact' check='full'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <vendor>AMD</vendor>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='x2apic'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc-deadline'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='hypervisor'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc_adjust'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='spec-ctrl'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='stibp'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='ssbd'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='cmp_legacy'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='overflow-recov'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='succor'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='ibrs'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='amd-ssbd'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='virt-ssbd'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='lbrv'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='tsc-scale'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='vmcb-clean'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='flushbyasid'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='pause-filter'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='pfthreshold'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='xsaves'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='svm'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='topoext'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='npt'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='nrip-save'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <clock offset='utc'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <timer name='pit' tickpolicy='delay'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <timer name='hpet' present='no'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <on_poweroff>destroy</on_poweroff>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <on_reboot>restart</on_reboot>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <on_crash>destroy</on_crash>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <disk type='file' device='disk'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk' index='2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <backingStore type='file' index='3'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:         <format type='raw'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:         <source file='/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:         <backingStore/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       </backingStore>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target dev='vda' bus='virtio'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='virtio-disk0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <disk type='file' device='cdrom'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <driver name='qemu' type='raw' cache='none'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk.config' index='1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <backingStore/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target dev='sda' bus='sata'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <readonly/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='sata0-0-0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='0' model='pcie-root'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pcie.0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='1' port='0x10'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='2' port='0x11'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='3' port='0x12'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.3'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='4' port='0x13'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.4'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='5' port='0x14'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.5'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='6' port='0x15'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.6'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='7' port='0x16'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.7'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='8' port='0x17'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.8'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='9' port='0x18'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.9'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='10' port='0x19'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.10'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='11' port='0x1a'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.11'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='12' port='0x1b'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.12'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='13' port='0x1c'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.13'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='14' port='0x1d'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.14'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='15' port='0x1e'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.15'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='16' port='0x1f'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.16'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='17' port='0x20'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.17'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='18' port='0x21'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.18'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='19' port='0x22'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.19'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='20' port='0x23'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.20'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='21' port='0x24'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.21'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='22' port='0x25'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.22'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='23' port='0x26'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.23'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='24' port='0x27'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.24'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='25' port='0x28'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.25'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-pci-bridge'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.26'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='usb'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='sata' index='0'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='ide'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <interface type='ethernet'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <mac address='fa:16:3e:de:a7:15'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target dev='tap3a1b6a7a-e8'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model type='virtio'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <driver name='vhost' rx_queue_size='512'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <mtu size='1442'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='net0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <interface type='ethernet'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <mac address='fa:16:3e:d1:e8:c1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target dev='tap06e93e85-4a'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model type='virtio'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <driver name='vhost' rx_queue_size='512'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <mtu size='1442'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='net1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <serial type='pty'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/console.log' append='off'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target type='isa-serial' port='0'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:         <model name='isa-serial'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       </target>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <console type='pty' tty='/dev/pts/0'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/console.log' append='off'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target type='serial' port='0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </console>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <input type='tablet' bus='usb'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='input0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='usb' bus='0' port='1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </input>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <input type='mouse' bus='ps2'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='input1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </input>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <input type='keyboard' bus='ps2'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='input2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </input>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <listen type='address' address='::0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </graphics>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <audio id='1' type='none'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <video>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model type='virtio' heads='1' primary='yes'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='video0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </video>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <watchdog model='itco' action='reset'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='watchdog0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </watchdog>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <memballoon model='virtio'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <stats period='10'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='balloon0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <rng model='virtio'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <backend model='random'>/dev/urandom</backend>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='rng0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <label>system_u:system_r:svirt_t:s0:c27,c854</label>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c27,c854</imagelabel>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <label>+107:+107</label>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <imagelabel>+107:+107</imagelabel>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:22:39 compute-0 nova_compute[186989]: </domain>
Dec 10 10:22:39 compute-0 nova_compute[186989]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.474 186993 INFO nova.virt.libvirt.driver [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully detached device tap06e93e85-4a from instance 06155ade-0041-467e-92e9-2fad99467514 from the persistent domain config.
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.475 186993 DEBUG nova.virt.libvirt.driver [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] (1/8): Attempting to detach device tap06e93e85-4a with device alias net1 from instance 06155ade-0041-467e-92e9-2fad99467514 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.475 186993 DEBUG nova.virt.libvirt.guest [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] detach device xml: <interface type="ethernet">
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <mac address="fa:16:3e:d1:e8:c1"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <model type="virtio"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <mtu size="1442"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <target dev="tap06e93e85-4a"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]: </interface>
Dec 10 10:22:39 compute-0 nova_compute[186989]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 10 10:22:39 compute-0 systemd[1]: Started libpod-conmon-3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443.scope.
Dec 10 10:22:39 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:22:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4772c6219deb7a35db8ff7da1670f0134304069ddb9387dc732a1cda181a950/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:22:39 compute-0 podman[214546]: 2025-12-10 10:22:39.576993271 +0000 UTC m=+1.295503838 container init 3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:22:39 compute-0 podman[214546]: 2025-12-10 10:22:39.583639153 +0000 UTC m=+1.302149700 container start 3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:22:39 compute-0 kernel: tap06e93e85-4a (unregistering): left promiscuous mode
Dec 10 10:22:39 compute-0 NetworkManager[55541]: <info>  [1765362159.5880] device (tap06e93e85-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:22:39 compute-0 ovn_controller[95452]: 2025-12-10T10:22:39Z|00061|binding|INFO|Releasing lport 06e93e85-4a32-4594-aa62-14281107bca2 from this chassis (sb_readonly=0)
Dec 10 10:22:39 compute-0 ovn_controller[95452]: 2025-12-10T10:22:39Z|00062|binding|INFO|Setting lport 06e93e85-4a32-4594-aa62-14281107bca2 down in Southbound
Dec 10 10:22:39 compute-0 ovn_controller[95452]: 2025-12-10T10:22:39Z|00063|binding|INFO|Removing iface tap06e93e85-4a ovn-installed in OVS
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.597 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:39 compute-0 neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352[214561]: [NOTICE]   (214565) : New worker (214571) forked
Dec 10 10:22:39 compute-0 neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352[214561]: [NOTICE]   (214565) : Loading success.
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.609 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.613 186993 DEBUG nova.virt.libvirt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Received event <DeviceRemovedEvent: 1765362159.6135664, 06155ade-0041-467e-92e9-2fad99467514 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.617 186993 DEBUG nova.virt.libvirt.driver [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Start waiting for the detach event from libvirt for device tap06e93e85-4a with device alias net1 for instance 06155ade-0041-467e-92e9-2fad99467514 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.618 186993 DEBUG nova.virt.libvirt.guest [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:d1:e8:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap06e93e85-4a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.618 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.622 186993 DEBUG nova.virt.libvirt.guest [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:d1:e8:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap06e93e85-4a"/></interface>not found in domain: <domain type='kvm' id='3'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <name>instance-00000003</name>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <uuid>06155ade-0041-467e-92e9-2fad99467514</uuid>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-553117970</nova:name>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:22:37</nova:creationTime>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:port uuid="3a1b6a7a-e8f7-421a-af57-fb303d77f486">
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:port uuid="06e93e85-4a32-4594-aa62-14281107bca2">
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:22:39 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <memory unit='KiB'>131072</memory>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <vcpu placement='static'>1</vcpu>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <resource>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <partition>/machine</partition>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </resource>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <sysinfo type='smbios'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <system>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <entry name='manufacturer'>RDO</entry>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <entry name='product'>OpenStack Compute</entry>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <entry name='serial'>06155ade-0041-467e-92e9-2fad99467514</entry>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <entry name='uuid'>06155ade-0041-467e-92e9-2fad99467514</entry>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <entry name='family'>Virtual Machine</entry>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </system>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <os>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <boot dev='hd'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <smbios mode='sysinfo'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </os>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <features>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <vmcoreinfo state='on'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </features>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <cpu mode='custom' match='exact' check='full'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <vendor>AMD</vendor>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='x2apic'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc-deadline'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='hypervisor'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc_adjust'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='spec-ctrl'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='stibp'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='ssbd'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='cmp_legacy'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='overflow-recov'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='succor'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='ibrs'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='amd-ssbd'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='virt-ssbd'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='lbrv'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='tsc-scale'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='vmcb-clean'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='flushbyasid'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='pause-filter'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='pfthreshold'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='xsaves'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='svm'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='require' name='topoext'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='npt'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <feature policy='disable' name='nrip-save'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <clock offset='utc'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <timer name='pit' tickpolicy='delay'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <timer name='hpet' present='no'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <on_poweroff>destroy</on_poweroff>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <on_reboot>restart</on_reboot>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <on_crash>destroy</on_crash>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <disk type='file' device='disk'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk' index='2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <backingStore type='file' index='3'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:         <format type='raw'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:         <source file='/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:         <backingStore/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       </backingStore>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target dev='vda' bus='virtio'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='virtio-disk0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <disk type='file' device='cdrom'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <driver name='qemu' type='raw' cache='none'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk.config' index='1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <backingStore/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target dev='sda' bus='sata'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <readonly/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='sata0-0-0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='0' model='pcie-root'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pcie.0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='1' port='0x10'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='2' port='0x11'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='3' port='0x12'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.3'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='4' port='0x13'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.4'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='5' port='0x14'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.5'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='6' port='0x15'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.6'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='7' port='0x16'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.7'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='8' port='0x17'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.8'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='9' port='0x18'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.9'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='10' port='0x19'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.10'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='11' port='0x1a'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.11'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='12' port='0x1b'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.12'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='13' port='0x1c'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.13'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='14' port='0x1d'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.14'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='15' port='0x1e'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.15'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='16' port='0x1f'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.16'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='17' port='0x20'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.17'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='18' port='0x21'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.18'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='19' port='0x22'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.19'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='20' port='0x23'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.20'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='21' port='0x24'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.21'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='22' port='0x25'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.22'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='23' port='0x26'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.23'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='24' port='0x27'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.24'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target chassis='25' port='0x28'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.25'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model name='pcie-pci-bridge'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='pci.26'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='usb'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <controller type='sata' index='0'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='ide'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <interface type='ethernet'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <mac address='fa:16:3e:de:a7:15'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target dev='tap3a1b6a7a-e8'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model type='virtio'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <driver name='vhost' rx_queue_size='512'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <mtu size='1442'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='net0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <serial type='pty'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/console.log' append='off'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target type='isa-serial' port='0'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:         <model name='isa-serial'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       </target>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <console type='pty' tty='/dev/pts/0'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/console.log' append='off'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <target type='serial' port='0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </console>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <input type='tablet' bus='usb'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='input0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='usb' bus='0' port='1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </input>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <input type='mouse' bus='ps2'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='input1'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </input>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <input type='keyboard' bus='ps2'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='input2'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </input>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <listen type='address' address='::0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </graphics>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <audio id='1' type='none'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <video>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <model type='virtio' heads='1' primary='yes'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='video0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </video>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <watchdog model='itco' action='reset'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='watchdog0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </watchdog>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <memballoon model='virtio'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <stats period='10'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='balloon0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <rng model='virtio'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <backend model='random'>/dev/urandom</backend>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <alias name='rng0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <label>system_u:system_r:svirt_t:s0:c27,c854</label>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c27,c854</imagelabel>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <label>+107:+107</label>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <imagelabel>+107:+107</imagelabel>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:22:39 compute-0 nova_compute[186989]: </domain>
Dec 10 10:22:39 compute-0 nova_compute[186989]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.622 186993 INFO nova.virt.libvirt.driver [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully detached device tap06e93e85-4a from instance 06155ade-0041-467e-92e9-2fad99467514 from the live domain config.
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.623 186993 DEBUG nova.virt.libvirt.vif [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:22:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-553117970',display_name='tempest-TestNetworkBasicOps-server-553117970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-553117970',id=3,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM9rulyR4iCiNffHiCIPsrHfxs4xEeqz3cOIfq/A+G1OidINlr8tKtpavNRs8X9mbGWMw4RRhgy5RN/1b5AV2X87wr9L+R9c+gEFdchYhXKlmXq1eyUtEqakYSehGvCdJg==',key_name='tempest-TestNetworkBasicOps-514887772',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:22:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-o3tuxyzd',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:22:10Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=06155ade-0041-467e-92e9-2fad99467514,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.623 186993 DEBUG nova.network.os_vif_util [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.624 186993 DEBUG nova.network.os_vif_util [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:e8:c1,bridge_name='br-int',has_traffic_filtering=True,id=06e93e85-4a32-4594-aa62-14281107bca2,network=Network(ae2c3369-0ad9-4308-8c8e-76562a3e5352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e93e85-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.624 186993 DEBUG os_vif [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:e8:c1,bridge_name='br-int',has_traffic_filtering=True,id=06e93e85-4a32-4594-aa62-14281107bca2,network=Network(ae2c3369-0ad9-4308-8c8e-76562a3e5352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e93e85-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.627 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.627 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e93e85-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.633 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:39 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:39.634 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:e8:c1 10.100.0.21'], port_security=['fa:16:3e:d1:e8:c1 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': '06155ade-0041-467e-92e9-2fad99467514', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae2c3369-0ad9-4308-8c8e-76562a3e5352', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '4', 'neutron:security_group_ids': '796e6156-6d8e-4cf4-b04a-830fa4553503', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b562f2f3-41da-4fcb-931f-ed0d05ebb198, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=06e93e85-4a32-4594-aa62-14281107bca2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.636 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.638 186993 INFO os_vif [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:e8:c1,bridge_name='br-int',has_traffic_filtering=True,id=06e93e85-4a32-4594-aa62-14281107bca2,network=Network(ae2c3369-0ad9-4308-8c8e-76562a3e5352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e93e85-4a')
Dec 10 10:22:39 compute-0 nova_compute[186989]: 2025-12-10 10:22:39.639 186993 DEBUG nova.virt.libvirt.guest [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-553117970</nova:name>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:22:39</nova:creationTime>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     <nova:port uuid="3a1b6a7a-e8f7-421a-af57-fb303d77f486">
Dec 10 10:22:39 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 10 10:22:39 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:22:39 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:22:39 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:22:39 compute-0 nova_compute[186989]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 10 10:22:39 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:39.680 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 06e93e85-4a32-4594-aa62-14281107bca2 in datapath ae2c3369-0ad9-4308-8c8e-76562a3e5352 unbound from our chassis
Dec 10 10:22:39 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:39.682 104302 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ae2c3369-0ad9-4308-8c8e-76562a3e5352, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 10 10:22:39 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:39.683 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[b101f973-15d0-4574-98b7-18a1c6ff835a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:39 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:39.684 104302 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352 namespace which is not needed anymore
Dec 10 10:22:39 compute-0 neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352[214561]: [NOTICE]   (214565) : haproxy version is 2.8.14-c23fe91
Dec 10 10:22:39 compute-0 neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352[214561]: [NOTICE]   (214565) : path to executable is /usr/sbin/haproxy
Dec 10 10:22:39 compute-0 neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352[214561]: [WARNING]  (214565) : Exiting Master process...
Dec 10 10:22:39 compute-0 neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352[214561]: [ALERT]    (214565) : Current worker (214571) exited with code 143 (Terminated)
Dec 10 10:22:39 compute-0 neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352[214561]: [WARNING]  (214565) : All workers exited. Exiting... (0)
Dec 10 10:22:39 compute-0 systemd[1]: libpod-3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443.scope: Deactivated successfully.
Dec 10 10:22:39 compute-0 podman[214597]: 2025-12-10 10:22:39.88000994 +0000 UTC m=+0.068823184 container died 3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:22:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443-userdata-shm.mount: Deactivated successfully.
Dec 10 10:22:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-b4772c6219deb7a35db8ff7da1670f0134304069ddb9387dc732a1cda181a950-merged.mount: Deactivated successfully.
Dec 10 10:22:39 compute-0 podman[214597]: 2025-12-10 10:22:39.925251417 +0000 UTC m=+0.114064671 container cleanup 3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 10 10:22:39 compute-0 systemd[1]: libpod-conmon-3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443.scope: Deactivated successfully.
Dec 10 10:22:40 compute-0 podman[214627]: 2025-12-10 10:22:39.999906829 +0000 UTC m=+0.048235220 container remove 3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 10 10:22:40 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:40.006 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[9e4fdb52-a826-4760-bdf4-1dc5fa252660]: (4, ('Wed Dec 10 10:22:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352 (3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443)\n3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443\nWed Dec 10 10:22:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352 (3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443)\n3756777e44a685081a595df6f549fcbba4465a92fce24bef25ce7674754c1443\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:40 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:40.008 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[e45aab55-d938-45b7-be6c-518907f31d26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:40 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:40.009 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae2c3369-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:40 compute-0 nova_compute[186989]: 2025-12-10 10:22:40.011 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:40 compute-0 kernel: tapae2c3369-00: left promiscuous mode
Dec 10 10:22:40 compute-0 nova_compute[186989]: 2025-12-10 10:22:40.023 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:40 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:40.025 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[4a3d9c54-f4c6-4f9c-96ed-582dffc6586e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:40 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:40.040 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[10459046-3f59-4d23-9cc5-c429ad5b03a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:40 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:40.041 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[41a61ac5-2c6d-4b21-8b5c-4a7c9753c006]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:40 compute-0 nova_compute[186989]: 2025-12-10 10:22:40.042 186993 DEBUG nova.compute.manager [req-566f065a-5284-428a-8b42-36d769bbaf25 req-1657a018-1543-4d76-b697-142b77039d39 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-vif-plugged-06e93e85-4a32-4594-aa62-14281107bca2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:40 compute-0 nova_compute[186989]: 2025-12-10 10:22:40.043 186993 DEBUG oslo_concurrency.lockutils [req-566f065a-5284-428a-8b42-36d769bbaf25 req-1657a018-1543-4d76-b697-142b77039d39 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "06155ade-0041-467e-92e9-2fad99467514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:40 compute-0 nova_compute[186989]: 2025-12-10 10:22:40.043 186993 DEBUG oslo_concurrency.lockutils [req-566f065a-5284-428a-8b42-36d769bbaf25 req-1657a018-1543-4d76-b697-142b77039d39 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:40 compute-0 nova_compute[186989]: 2025-12-10 10:22:40.044 186993 DEBUG oslo_concurrency.lockutils [req-566f065a-5284-428a-8b42-36d769bbaf25 req-1657a018-1543-4d76-b697-142b77039d39 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:40 compute-0 nova_compute[186989]: 2025-12-10 10:22:40.044 186993 DEBUG nova.compute.manager [req-566f065a-5284-428a-8b42-36d769bbaf25 req-1657a018-1543-4d76-b697-142b77039d39 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] No waiting events found dispatching network-vif-plugged-06e93e85-4a32-4594-aa62-14281107bca2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:22:40 compute-0 nova_compute[186989]: 2025-12-10 10:22:40.044 186993 WARNING nova.compute.manager [req-566f065a-5284-428a-8b42-36d769bbaf25 req-1657a018-1543-4d76-b697-142b77039d39 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received unexpected event network-vif-plugged-06e93e85-4a32-4594-aa62-14281107bca2 for instance with vm_state active and task_state None.
Dec 10 10:22:40 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:40.057 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[931eeeeb-b236-4af6-85dc-ed412e7a544e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 317288, 'reachable_time': 42245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214642, 'error': None, 'target': 'ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:40 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:40.061 104414 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ae2c3369-0ad9-4308-8c8e-76562a3e5352 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 10 10:22:40 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:40.061 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[0c7ed4d7-7a82-4665-a042-be206bf4642b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:40 compute-0 systemd[1]: run-netns-ovnmeta\x2dae2c3369\x2d0ad9\x2d4308\x2d8c8e\x2d76562a3e5352.mount: Deactivated successfully.
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.760 186993 DEBUG oslo_concurrency.lockutils [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.760 186993 DEBUG oslo_concurrency.lockutils [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.761 186993 DEBUG nova.network.neutron [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.812 186993 DEBUG nova.compute.manager [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-vif-deleted-06e93e85-4a32-4594-aa62-14281107bca2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.813 186993 INFO nova.compute.manager [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Neutron deleted interface 06e93e85-4a32-4594-aa62-14281107bca2; detaching it from the instance and deleting it from the info cache
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.813 186993 DEBUG nova.network.neutron [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updating instance_info_cache with network_info: [{"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.834 186993 DEBUG nova.objects.instance [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lazy-loading 'system_metadata' on Instance uuid 06155ade-0041-467e-92e9-2fad99467514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.876 186993 DEBUG nova.objects.instance [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lazy-loading 'flavor' on Instance uuid 06155ade-0041-467e-92e9-2fad99467514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.915 186993 DEBUG nova.virt.libvirt.vif [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:22:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-553117970',display_name='tempest-TestNetworkBasicOps-server-553117970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-553117970',id=3,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM9rulyR4iCiNffHiCIPsrHfxs4xEeqz3cOIfq/A+G1OidINlr8tKtpavNRs8X9mbGWMw4RRhgy5RN/1b5AV2X87wr9L+R9c+gEFdchYhXKlmXq1eyUtEqakYSehGvCdJg==',key_name='tempest-TestNetworkBasicOps-514887772',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:22:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-o3tuxyzd',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:22:10Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=06155ade-0041-467e-92e9-2fad99467514,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.916 186993 DEBUG nova.network.os_vif_util [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Converting VIF {"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.917 186993 DEBUG nova.network.os_vif_util [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:e8:c1,bridge_name='br-int',has_traffic_filtering=True,id=06e93e85-4a32-4594-aa62-14281107bca2,network=Network(ae2c3369-0ad9-4308-8c8e-76562a3e5352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e93e85-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.919 186993 DEBUG nova.virt.libvirt.guest [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:d1:e8:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap06e93e85-4a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.924 186993 DEBUG nova.virt.libvirt.guest [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:d1:e8:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap06e93e85-4a"/></interface>not found in domain: <domain type='kvm' id='3'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <name>instance-00000003</name>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <uuid>06155ade-0041-467e-92e9-2fad99467514</uuid>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-553117970</nova:name>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:22:39</nova:creationTime>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:port uuid="3a1b6a7a-e8f7-421a-af57-fb303d77f486">
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:22:41 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <memory unit='KiB'>131072</memory>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <vcpu placement='static'>1</vcpu>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <resource>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <partition>/machine</partition>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </resource>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <sysinfo type='smbios'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <system>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <entry name='manufacturer'>RDO</entry>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <entry name='product'>OpenStack Compute</entry>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <entry name='serial'>06155ade-0041-467e-92e9-2fad99467514</entry>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <entry name='uuid'>06155ade-0041-467e-92e9-2fad99467514</entry>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <entry name='family'>Virtual Machine</entry>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </system>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <os>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <boot dev='hd'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <smbios mode='sysinfo'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </os>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <features>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <vmcoreinfo state='on'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </features>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <cpu mode='custom' match='exact' check='full'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <vendor>AMD</vendor>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='x2apic'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc-deadline'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='hypervisor'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc_adjust'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='spec-ctrl'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='stibp'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='ssbd'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='cmp_legacy'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='overflow-recov'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='succor'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='ibrs'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='amd-ssbd'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='virt-ssbd'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='lbrv'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='tsc-scale'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='vmcb-clean'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='flushbyasid'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='pause-filter'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='pfthreshold'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='xsaves'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='svm'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='topoext'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='npt'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='nrip-save'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <clock offset='utc'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <timer name='pit' tickpolicy='delay'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <timer name='hpet' present='no'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <on_poweroff>destroy</on_poweroff>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <on_reboot>restart</on_reboot>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <on_crash>destroy</on_crash>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <disk type='file' device='disk'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk' index='2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <backingStore type='file' index='3'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:         <format type='raw'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:         <source file='/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:         <backingStore/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       </backingStore>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target dev='vda' bus='virtio'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='virtio-disk0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <disk type='file' device='cdrom'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <driver name='qemu' type='raw' cache='none'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk.config' index='1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <backingStore/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target dev='sda' bus='sata'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <readonly/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='sata0-0-0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='0' model='pcie-root'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pcie.0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='1' port='0x10'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='2' port='0x11'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='3' port='0x12'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.3'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='4' port='0x13'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.4'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='5' port='0x14'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.5'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='6' port='0x15'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.6'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='7' port='0x16'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.7'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='8' port='0x17'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.8'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='9' port='0x18'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.9'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='10' port='0x19'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.10'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='11' port='0x1a'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.11'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='12' port='0x1b'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.12'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='13' port='0x1c'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.13'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='14' port='0x1d'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.14'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='15' port='0x1e'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.15'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='16' port='0x1f'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.16'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='17' port='0x20'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.17'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='18' port='0x21'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.18'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='19' port='0x22'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.19'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='20' port='0x23'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.20'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='21' port='0x24'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.21'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='22' port='0x25'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.22'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='23' port='0x26'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.23'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='24' port='0x27'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.24'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='25' port='0x28'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.25'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-pci-bridge'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.26'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='usb'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='sata' index='0'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='ide'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <interface type='ethernet'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <mac address='fa:16:3e:de:a7:15'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target dev='tap3a1b6a7a-e8'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model type='virtio'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <driver name='vhost' rx_queue_size='512'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <mtu size='1442'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='net0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <serial type='pty'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/console.log' append='off'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target type='isa-serial' port='0'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:         <model name='isa-serial'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       </target>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <console type='pty' tty='/dev/pts/0'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/console.log' append='off'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target type='serial' port='0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </console>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <input type='tablet' bus='usb'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='input0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='usb' bus='0' port='1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </input>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <input type='mouse' bus='ps2'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='input1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </input>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <input type='keyboard' bus='ps2'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='input2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </input>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <listen type='address' address='::0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </graphics>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <audio id='1' type='none'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <video>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model type='virtio' heads='1' primary='yes'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='video0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </video>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <watchdog model='itco' action='reset'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='watchdog0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </watchdog>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <memballoon model='virtio'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <stats period='10'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='balloon0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <rng model='virtio'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <backend model='random'>/dev/urandom</backend>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='rng0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <label>system_u:system_r:svirt_t:s0:c27,c854</label>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c27,c854</imagelabel>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <label>+107:+107</label>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <imagelabel>+107:+107</imagelabel>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:22:41 compute-0 nova_compute[186989]: </domain>
Dec 10 10:22:41 compute-0 nova_compute[186989]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.926 186993 DEBUG nova.virt.libvirt.guest [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:d1:e8:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap06e93e85-4a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.931 186993 DEBUG nova.virt.libvirt.guest [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:d1:e8:c1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap06e93e85-4a"/></interface>not found in domain: <domain type='kvm' id='3'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <name>instance-00000003</name>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <uuid>06155ade-0041-467e-92e9-2fad99467514</uuid>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-553117970</nova:name>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:22:39</nova:creationTime>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:port uuid="3a1b6a7a-e8f7-421a-af57-fb303d77f486">
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:22:41 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <memory unit='KiB'>131072</memory>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <vcpu placement='static'>1</vcpu>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <resource>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <partition>/machine</partition>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </resource>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <sysinfo type='smbios'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <system>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <entry name='manufacturer'>RDO</entry>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <entry name='product'>OpenStack Compute</entry>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <entry name='serial'>06155ade-0041-467e-92e9-2fad99467514</entry>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <entry name='uuid'>06155ade-0041-467e-92e9-2fad99467514</entry>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <entry name='family'>Virtual Machine</entry>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </system>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <os>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <boot dev='hd'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <smbios mode='sysinfo'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </os>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <features>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <vmcoreinfo state='on'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </features>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <cpu mode='custom' match='exact' check='full'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <vendor>AMD</vendor>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='x2apic'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc-deadline'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='hypervisor'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc_adjust'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='spec-ctrl'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='stibp'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='ssbd'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='cmp_legacy'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='overflow-recov'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='succor'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='ibrs'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='amd-ssbd'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='virt-ssbd'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='lbrv'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='tsc-scale'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='vmcb-clean'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='flushbyasid'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='pause-filter'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='pfthreshold'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='xsaves'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='svm'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='require' name='topoext'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='npt'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <feature policy='disable' name='nrip-save'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <clock offset='utc'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <timer name='pit' tickpolicy='delay'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <timer name='hpet' present='no'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <on_poweroff>destroy</on_poweroff>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <on_reboot>restart</on_reboot>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <on_crash>destroy</on_crash>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <disk type='file' device='disk'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk' index='2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <backingStore type='file' index='3'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:         <format type='raw'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:         <source file='/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:         <backingStore/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       </backingStore>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target dev='vda' bus='virtio'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='virtio-disk0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <disk type='file' device='cdrom'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <driver name='qemu' type='raw' cache='none'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/disk.config' index='1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <backingStore/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target dev='sda' bus='sata'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <readonly/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='sata0-0-0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='0' model='pcie-root'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pcie.0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='1' port='0x10'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='2' port='0x11'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='3' port='0x12'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.3'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='4' port='0x13'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.4'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='5' port='0x14'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.5'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='6' port='0x15'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.6'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='7' port='0x16'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.7'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='8' port='0x17'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.8'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='9' port='0x18'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.9'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='10' port='0x19'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.10'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='11' port='0x1a'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.11'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='12' port='0x1b'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.12'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='13' port='0x1c'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.13'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='14' port='0x1d'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.14'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='15' port='0x1e'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.15'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='16' port='0x1f'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.16'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='17' port='0x20'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.17'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='18' port='0x21'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.18'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='19' port='0x22'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.19'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='20' port='0x23'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.20'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='21' port='0x24'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.21'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='22' port='0x25'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.22'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='23' port='0x26'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.23'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='24' port='0x27'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.24'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target chassis='25' port='0x28'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.25'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model name='pcie-pci-bridge'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='pci.26'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='usb'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <controller type='sata' index='0'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='ide'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <interface type='ethernet'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <mac address='fa:16:3e:de:a7:15'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target dev='tap3a1b6a7a-e8'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model type='virtio'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <driver name='vhost' rx_queue_size='512'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <mtu size='1442'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='net0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <serial type='pty'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/console.log' append='off'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target type='isa-serial' port='0'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:         <model name='isa-serial'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       </target>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <console type='pty' tty='/dev/pts/0'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514/console.log' append='off'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <target type='serial' port='0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </console>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <input type='tablet' bus='usb'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='input0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='usb' bus='0' port='1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </input>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <input type='mouse' bus='ps2'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='input1'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </input>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <input type='keyboard' bus='ps2'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='input2'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </input>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <listen type='address' address='::0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </graphics>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <audio id='1' type='none'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <video>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <model type='virtio' heads='1' primary='yes'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='video0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </video>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <watchdog model='itco' action='reset'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='watchdog0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </watchdog>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <memballoon model='virtio'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <stats period='10'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='balloon0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <rng model='virtio'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <backend model='random'>/dev/urandom</backend>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <alias name='rng0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <label>system_u:system_r:svirt_t:s0:c27,c854</label>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c27,c854</imagelabel>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <label>+107:+107</label>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <imagelabel>+107:+107</imagelabel>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:22:41 compute-0 nova_compute[186989]: </domain>
Dec 10 10:22:41 compute-0 nova_compute[186989]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.934 186993 WARNING nova.virt.libvirt.driver [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Detaching interface fa:16:3e:d1:e8:c1 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap06e93e85-4a' not found.
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.936 186993 DEBUG nova.virt.libvirt.vif [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:22:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-553117970',display_name='tempest-TestNetworkBasicOps-server-553117970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-553117970',id=3,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM9rulyR4iCiNffHiCIPsrHfxs4xEeqz3cOIfq/A+G1OidINlr8tKtpavNRs8X9mbGWMw4RRhgy5RN/1b5AV2X87wr9L+R9c+gEFdchYhXKlmXq1eyUtEqakYSehGvCdJg==',key_name='tempest-TestNetworkBasicOps-514887772',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:22:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-o3tuxyzd',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:22:10Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=06155ade-0041-467e-92e9-2fad99467514,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.937 186993 DEBUG nova.network.os_vif_util [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Converting VIF {"id": "06e93e85-4a32-4594-aa62-14281107bca2", "address": "fa:16:3e:d1:e8:c1", "network": {"id": "ae2c3369-0ad9-4308-8c8e-76562a3e5352", "bridge": "br-int", "label": "tempest-network-smoke--617965492", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06e93e85-4a", "ovs_interfaceid": "06e93e85-4a32-4594-aa62-14281107bca2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.938 186993 DEBUG nova.network.os_vif_util [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:e8:c1,bridge_name='br-int',has_traffic_filtering=True,id=06e93e85-4a32-4594-aa62-14281107bca2,network=Network(ae2c3369-0ad9-4308-8c8e-76562a3e5352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e93e85-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.938 186993 DEBUG os_vif [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:e8:c1,bridge_name='br-int',has_traffic_filtering=True,id=06e93e85-4a32-4594-aa62-14281107bca2,network=Network(ae2c3369-0ad9-4308-8c8e-76562a3e5352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e93e85-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.940 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.940 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06e93e85-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.940 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.944 186993 INFO os_vif [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:e8:c1,bridge_name='br-int',has_traffic_filtering=True,id=06e93e85-4a32-4594-aa62-14281107bca2,network=Network(ae2c3369-0ad9-4308-8c8e-76562a3e5352),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06e93e85-4a')
Dec 10 10:22:41 compute-0 nova_compute[186989]: 2025-12-10 10:22:41.945 186993 DEBUG nova.virt.libvirt.guest [req-b9b71012-e6a9-43e9-99e3-8c362979c89c req-fd3ab202-e98e-4a59-8650-76d2ba0c9630 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-553117970</nova:name>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:22:41</nova:creationTime>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     <nova:port uuid="3a1b6a7a-e8f7-421a-af57-fb303d77f486">
Dec 10 10:22:41 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 10 10:22:41 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:22:41 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:22:41 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:22:41 compute-0 nova_compute[186989]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.122 186993 DEBUG nova.compute.manager [req-cee83d1b-6e0a-4875-9f8e-e53db6b5bbbc req-d289a9a7-877e-4934-bf56-7c1213c55cec 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-vif-unplugged-06e93e85-4a32-4594-aa62-14281107bca2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.122 186993 DEBUG oslo_concurrency.lockutils [req-cee83d1b-6e0a-4875-9f8e-e53db6b5bbbc req-d289a9a7-877e-4934-bf56-7c1213c55cec 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "06155ade-0041-467e-92e9-2fad99467514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.123 186993 DEBUG oslo_concurrency.lockutils [req-cee83d1b-6e0a-4875-9f8e-e53db6b5bbbc req-d289a9a7-877e-4934-bf56-7c1213c55cec 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.123 186993 DEBUG oslo_concurrency.lockutils [req-cee83d1b-6e0a-4875-9f8e-e53db6b5bbbc req-d289a9a7-877e-4934-bf56-7c1213c55cec 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.123 186993 DEBUG nova.compute.manager [req-cee83d1b-6e0a-4875-9f8e-e53db6b5bbbc req-d289a9a7-877e-4934-bf56-7c1213c55cec 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] No waiting events found dispatching network-vif-unplugged-06e93e85-4a32-4594-aa62-14281107bca2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.123 186993 WARNING nova.compute.manager [req-cee83d1b-6e0a-4875-9f8e-e53db6b5bbbc req-d289a9a7-877e-4934-bf56-7c1213c55cec 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received unexpected event network-vif-unplugged-06e93e85-4a32-4594-aa62-14281107bca2 for instance with vm_state active and task_state None.
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.123 186993 DEBUG nova.compute.manager [req-cee83d1b-6e0a-4875-9f8e-e53db6b5bbbc req-d289a9a7-877e-4934-bf56-7c1213c55cec 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-vif-plugged-06e93e85-4a32-4594-aa62-14281107bca2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.124 186993 DEBUG oslo_concurrency.lockutils [req-cee83d1b-6e0a-4875-9f8e-e53db6b5bbbc req-d289a9a7-877e-4934-bf56-7c1213c55cec 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "06155ade-0041-467e-92e9-2fad99467514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.124 186993 DEBUG oslo_concurrency.lockutils [req-cee83d1b-6e0a-4875-9f8e-e53db6b5bbbc req-d289a9a7-877e-4934-bf56-7c1213c55cec 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.124 186993 DEBUG oslo_concurrency.lockutils [req-cee83d1b-6e0a-4875-9f8e-e53db6b5bbbc req-d289a9a7-877e-4934-bf56-7c1213c55cec 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.124 186993 DEBUG nova.compute.manager [req-cee83d1b-6e0a-4875-9f8e-e53db6b5bbbc req-d289a9a7-877e-4934-bf56-7c1213c55cec 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] No waiting events found dispatching network-vif-plugged-06e93e85-4a32-4594-aa62-14281107bca2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.125 186993 WARNING nova.compute.manager [req-cee83d1b-6e0a-4875-9f8e-e53db6b5bbbc req-d289a9a7-877e-4934-bf56-7c1213c55cec 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received unexpected event network-vif-plugged-06e93e85-4a32-4594-aa62-14281107bca2 for instance with vm_state active and task_state None.
Dec 10 10:22:42 compute-0 nova_compute[186989]: 2025-12-10 10:22:42.226 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:44 compute-0 nova_compute[186989]: 2025-12-10 10:22:44.407 186993 INFO nova.network.neutron [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Port 06e93e85-4a32-4594-aa62-14281107bca2 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Dec 10 10:22:44 compute-0 nova_compute[186989]: 2025-12-10 10:22:44.408 186993 DEBUG nova.network.neutron [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updating instance_info_cache with network_info: [{"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:22:44 compute-0 nova_compute[186989]: 2025-12-10 10:22:44.603 186993 DEBUG oslo_concurrency.lockutils [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:22:44 compute-0 nova_compute[186989]: 2025-12-10 10:22:44.633 186993 DEBUG oslo_concurrency.lockutils [None req-9005b3f6-615e-4fe7-893a-088adb91cbd6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "interface-06155ade-0041-467e-92e9-2fad99467514-06e93e85-4a32-4594-aa62-14281107bca2" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:44 compute-0 nova_compute[186989]: 2025-12-10 10:22:44.635 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:45 compute-0 podman[214643]: 2025-12-10 10:22:45.00850133 +0000 UTC m=+0.055280022 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 10 10:22:45 compute-0 nova_compute[186989]: 2025-12-10 10:22:45.306 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.425 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '06155ade-0041-467e-92e9-2fad99467514', 'name': 'tempest-TestNetworkBasicOps-server-553117970', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '82da19f85bb840d2a70395c3d761ef38', 'user_id': '603f9c3a99e145e4a64248329321a249', 'hostId': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.426 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.441 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/cpu volume: 11370000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00356fa0-9533-4296-baa3-12ce64e8ffba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11370000000, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514', 'timestamp': '2025-12-10T10:22:45.426280', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2b374664-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71504707, 'message_signature': '25e5cf4e4c5473f66c348fef6517c813520ebef0d0ec9d87e2600063e4583e86'}]}, 'timestamp': '2025-12-10 10:22:45.442529', '_unique_id': '90bfba87fd50400187ae35faefddd9f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.444 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.468 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.write.bytes volume: 73019392 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.469 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d8d176d-1215-4f4c-9b21-12a0ac303f1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73019392, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-vda', 'timestamp': '2025-12-10T10:22:45.445028', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b3b67bc-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71835942, 'message_signature': '6ff382c88f77b87e9707847fbac0fbe0efa3f6bf4985d68a81b07f90737511c5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-sda', 'timestamp': '2025-12-10T10:22:45.445028', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b3b813e-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71835942, 'message_signature': '8c20d6ff83f464d354dbac9b3351c8e964f7038c5b22a0a7de87923083042a93'}]}, 'timestamp': '2025-12-10 10:22:45.470311', '_unique_id': '0242f8090d5048f48b152af462c5c161'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.472 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.473 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.476 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 06155ade-0041-467e-92e9-2fad99467514 / tap3a1b6a7a-e8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.477 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83a3ad16-e58b-46f8-85a4-b9f69b5b89ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000003-06155ade-0041-467e-92e9-2fad99467514-tap3a1b6a7a-e8', 'timestamp': '2025-12-10T10:22:45.474051', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'tap3a1b6a7a-e8', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:a7:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a1b6a7a-e8'}, 'message_id': '2b3ca848-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.747442506, 'message_signature': 'fbf02082920aab124a928eaef3a859c63b600f62e4641db4ebceb33d19d70d5f'}]}, 'timestamp': '2025-12-10 10:22:45.477911', '_unique_id': '79c14de29adc4690a902f696636c2b66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.479 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.481 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.481 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.481 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-553117970>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-553117970>]
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.481 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.482 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.read.bytes volume: 31246848 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.482 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b569917-2a48-47cc-a1a0-2939bfb36f2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31246848, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-vda', 'timestamp': '2025-12-10T10:22:45.482035', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b3d6044-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71835942, 'message_signature': 'b8d30f85811f4f28fad3af324b01181be9984fc366670fe4f20fcbcaa65aab03'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-sda', 'timestamp': '2025-12-10T10:22:45.482035', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b3d73ea-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71835942, 'message_signature': 'b431844054f5b4a0e55aeff6989125f3ec507f37bdc88d17c91265729f42eeb7'}]}, 'timestamp': '2025-12-10 10:22:45.483098', '_unique_id': 'e6ab2fe578834ccbb492b8d07aab5629'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.484 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.485 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.486 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c1d4176-aa59-4720-9c82-9f61c3024109', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000003-06155ade-0041-467e-92e9-2fad99467514-tap3a1b6a7a-e8', 'timestamp': '2025-12-10T10:22:45.486061', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'tap3a1b6a7a-e8', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:a7:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a1b6a7a-e8'}, 'message_id': '2b3dfd88-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.747442506, 'message_signature': 'c89da5cba79a20302ba099e664376f4260fccdba4354fd0ce4eb9960a948c432'}]}, 'timestamp': '2025-12-10 10:22:45.486567', '_unique_id': 'b138ab2704cd4ba986f2071176d42a76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.487 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.488 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.489 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.489 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-553117970>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-553117970>]
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.489 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.502 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.502 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7d0546c-9a31-42dd-ac6d-b1ac067414f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-vda', 'timestamp': '2025-12-10T10:22:45.489529', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b4073f6-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.762900998, 'message_signature': '4a23bcb64a42d03a739fbfcd3d49890ae3fe703704fa2a21724b370faba28cee'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-sda', 'timestamp': '2025-12-10T10:22:45.489529', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b40880a-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.762900998, 'message_signature': '8a476092d75c36d01cbaeb7f040080528be27c2110a4b6ebd96a893ba62a549e'}]}, 'timestamp': '2025-12-10 10:22:45.503195', '_unique_id': '04d1d1441cd4494ca6748a879e0c14cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.504 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.505 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.506 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.506 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-553117970>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-553117970>]
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.506 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.506 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.506 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9c36ee3-66f3-432f-837a-5a075b0c090d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-vda', 'timestamp': '2025-12-10T10:22:45.506461', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b411734-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.762900998, 'message_signature': '68a87e42c5b4bd6af46dbdeb6954f4cc96e3c0f77d3330a58757864aa7c30ddf'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-sda', 'timestamp': '2025-12-10T10:22:45.506461', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b41249a-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.762900998, 'message_signature': 'b7a06ac0781d1a682375b452224f7c5a37d9f97cc684a4e9eed6b36403dbd0ee'}]}, 'timestamp': '2025-12-10 10:22:45.507128', '_unique_id': '67ad8e5f245a49b68d4ec15e0f83f678'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.507 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.508 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c93b52cd-a435-4dd1-9f89-5c5521a4d22e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000003-06155ade-0041-467e-92e9-2fad99467514-tap3a1b6a7a-e8', 'timestamp': '2025-12-10T10:22:45.508883', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'tap3a1b6a7a-e8', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:a7:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a1b6a7a-e8'}, 'message_id': '2b41758a-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.747442506, 'message_signature': '79c60a6640db2f2b494e7d49e5b4c852acc50bba0b111023bd51b8805df3461e'}]}, 'timestamp': '2025-12-10 10:22:45.509249', '_unique_id': '4217982a71a44ed49365e5cfcc5c39f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.509 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.510 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.511 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/network.outgoing.packets volume: 150 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '862eb458-5c45-4b17-a955-47ff5771e57c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 150, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000003-06155ade-0041-467e-92e9-2fad99467514-tap3a1b6a7a-e8', 'timestamp': '2025-12-10T10:22:45.511047', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'tap3a1b6a7a-e8', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:a7:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a1b6a7a-e8'}, 'message_id': '2b41c9f4-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.747442506, 'message_signature': 'd7ee38e7d8186f8a4a25e1710e81a92a373e9fc14d90920f7868695d7c5723dd'}]}, 'timestamp': '2025-12-10 10:22:45.511382', '_unique_id': '93be2667f1734710bec72c1ad0934414'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.512 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.513 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3fe9737-cd9f-4de6-9e9c-689f1715f6f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000003-06155ade-0041-467e-92e9-2fad99467514-tap3a1b6a7a-e8', 'timestamp': '2025-12-10T10:22:45.513068', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'tap3a1b6a7a-e8', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:a7:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a1b6a7a-e8'}, 'message_id': '2b4218dc-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.747442506, 'message_signature': '060f026568a77f9c846dfe6fb394fe69fa03c4c6d3576ec7dc5fd0b3475a9438'}]}, 'timestamp': '2025-12-10 10:22:45.513402', '_unique_id': '626d9b75450f4bc9b9cbddf7c5385a9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.514 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.515 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.515 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.515 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92fb8f43-cea7-4261-87f4-fc15272ae5d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-vda', 'timestamp': '2025-12-10T10:22:45.515517', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b427980-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.762900998, 'message_signature': 'dfd1155f953d26ebc91e744084794da5130159c00d029b32a2a3f26f22f304ca'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-sda', 'timestamp': '2025-12-10T10:22:45.515517', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b428a42-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.762900998, 'message_signature': 'e8eed02e0f325f00638c2d15572c20647247020f5276e48bd0d0ca485f101282'}]}, 'timestamp': '2025-12-10 10:22:45.516318', '_unique_id': '610def986cd34d49b1f8cc7f201cd9ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.517 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.518 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.518 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/memory.usage volume: 43.0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8309cb8e-bb63-4578-a146-86a28a824a51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514', 'timestamp': '2025-12-10T10:22:45.518287', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '2b42e5dc-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71504707, 'message_signature': '3bbf58808d08fcc2124ee8c3f7288954b1381fe92a04dd7394099fd3befa0832'}]}, 'timestamp': '2025-12-10 10:22:45.518666', '_unique_id': '474dff96959345e6a91348a03c475f6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.519 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.520 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.520 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.520 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-553117970>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-553117970>]
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.521 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.521 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e64cdeff-d1af-482a-98d9-0ac8c1fe9940', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000003-06155ade-0041-467e-92e9-2fad99467514-tap3a1b6a7a-e8', 'timestamp': '2025-12-10T10:22:45.521127', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'tap3a1b6a7a-e8', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:a7:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a1b6a7a-e8'}, 'message_id': '2b4354fe-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.747442506, 'message_signature': 'd894c4c89ea0c9547dda797e5eec7945dbbfe5ef525013742d36573e530b9ee1'}]}, 'timestamp': '2025-12-10 10:22:45.521529', '_unique_id': 'f5496917998848a29818734ac42d6656'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.522 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.523 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.523 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.read.latency volume: 219360417 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.523 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.read.latency volume: 27636433 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c619e73-2f3e-4116-97ff-031ecc7c5a63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 219360417, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-vda', 'timestamp': '2025-12-10T10:22:45.523477', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b43b0c0-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71835942, 'message_signature': 'd119ecde76b1f485f7d295a8bf4c3145c782ed1154c2d1dcfe63cff95c9e82ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27636433, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 
'resource_id': '06155ade-0041-467e-92e9-2fad99467514-sda', 'timestamp': '2025-12-10T10:22:45.523477', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b43c902-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71835942, 'message_signature': '18ee47955b533be07f3e7c65e0b9b75838f9d9e0356e742cf5c398a350331899'}]}, 'timestamp': '2025-12-10 10:22:45.524497', '_unique_id': 'db444495549d4cf3aca0f0c9ab0143ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.525 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.526 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.526 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/network.outgoing.bytes volume: 24066 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1787d37-e2d6-4122-976e-a13dc9388a6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 24066, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000003-06155ade-0041-467e-92e9-2fad99467514-tap3a1b6a7a-e8', 'timestamp': '2025-12-10T10:22:45.526691', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'tap3a1b6a7a-e8', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:a7:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a1b6a7a-e8'}, 'message_id': '2b443194-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.747442506, 'message_signature': 'd267f0a18492177ccf3966554f69513fd0f5012910327b3255cd859cba905f0e'}]}, 'timestamp': '2025-12-10 10:22:45.527180', '_unique_id': '7bfed5f59fa3423886196d34cf7e8177'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.528 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.529 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/network.incoming.bytes volume: 28081 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea693b07-b444-4acd-8147-f9a99627321a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28081, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000003-06155ade-0041-467e-92e9-2fad99467514-tap3a1b6a7a-e8', 'timestamp': '2025-12-10T10:22:45.529547', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'tap3a1b6a7a-e8', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:a7:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a1b6a7a-e8'}, 'message_id': '2b44a17e-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.747442506, 'message_signature': '11dd4a68fa9c6387a2cf54019f2be7d24b2a236baa04bb72b18c140713b84e5f'}]}, 'timestamp': '2025-12-10 10:22:45.530091', '_unique_id': '4a68230179254d64b7515c1512ea85e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.530 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.531 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.531 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7efc5f48-feb2-4a65-9795-c88635bc1287', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000003-06155ade-0041-467e-92e9-2fad99467514-tap3a1b6a7a-e8', 'timestamp': '2025-12-10T10:22:45.531779', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'tap3a1b6a7a-e8', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:a7:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a1b6a7a-e8'}, 'message_id': '2b44f412-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.747442506, 'message_signature': '2a3327213f453955f5a4ae1af7f17cf62f387a64711a4fbd264742473382442f'}]}, 'timestamp': '2025-12-10 10:22:45.532119', '_unique_id': 'ac4c12231d114ecebbf59a4a7cbd24e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.532 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.533 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.533 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.read.requests volume: 1125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.534 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '972a8824-84af-4453-bcd1-42cbec719d84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1125, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-vda', 'timestamp': '2025-12-10T10:22:45.533688', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b454098-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71835942, 'message_signature': '4dc8444a521ec80f83f5f3125f75df8c1381e33d211f0fff8db9215801a20d41'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-sda', 'timestamp': '2025-12-10T10:22:45.533688', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b454c78-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71835942, 'message_signature': 'd8b4d82e2d85ad93beff0abd2bfa9dc71ba84b3e85a939017f397056d6839cc8'}]}, 'timestamp': '2025-12-10 10:22:45.534361', '_unique_id': 'ac77334e7821432a9e0374a92eab0246'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.535 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.536 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/network.incoming.packets volume: 149 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fc71bd9-635e-4064-8760-74f047ff82af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 149, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000003-06155ade-0041-467e-92e9-2fad99467514-tap3a1b6a7a-e8', 'timestamp': '2025-12-10T10:22:45.536026', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'tap3a1b6a7a-e8', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:de:a7:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3a1b6a7a-e8'}, 'message_id': '2b459ae8-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.747442506, 'message_signature': '125c0d19db9fdbb882a52f813e034299b91fd1fdac0c02e5baabe9d554c1ab4f'}]}, 'timestamp': '2025-12-10 10:22:45.536441', '_unique_id': 'b5e9962b9fce4b3d979dd7a10bdaa7b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.537 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.538 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.write.latency volume: 8024413041 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.538 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cdb30d5b-3e53-4717-b868-df32916eb617', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8024413041, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-vda', 'timestamp': '2025-12-10T10:22:45.537989', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b45e688-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71835942, 'message_signature': '2fabd392ae39826a6f4f4849d1eed02072c2d952ef5b572cfe916ffd233a1247'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-sda', 'timestamp': '2025-12-10T10:22:45.537989', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b45f22c-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71835942, 'message_signature': 'd5b8bec767865c099512aa16421c068a41261b93add2c8c004b2151c6dab909d'}]}, 'timestamp': '2025-12-10 10:22:45.538598', '_unique_id': '1eac5be589f2417a9ee44a9abf3a2f2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.539 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.540 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.540 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.write.requests volume: 310 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.540 12 DEBUG ceilometer.compute.pollsters [-] 06155ade-0041-467e-92e9-2fad99467514/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '902ed4c3-aa6c-43db-8a56-3f18df79e685', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 310, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '06155ade-0041-467e-92e9-2fad99467514-vda', 'timestamp': '2025-12-10T10:22:45.540192', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2b463d2c-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71835942, 'message_signature': 'c2c6423aeb0053668f2b0804e138bda1e0300cbdf1c0cce4eed608115b065404'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 
'resource_id': '06155ade-0041-467e-92e9-2fad99467514-sda', 'timestamp': '2025-12-10T10:22:45.540192', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-553117970', 'name': 'instance-00000003', 'instance_id': '06155ade-0041-467e-92e9-2fad99467514', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2b4648da-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3180.71835942, 'message_signature': '171a2c8c55a4ee1e5dfac77dbd9b85439c0fec7358ad50312ee486ced33dd96d'}]}, 'timestamp': '2025-12-10 10:22:45.540846', '_unique_id': 'd0f41fa184844de7946b6981f19ffe1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:22:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:22:45.541 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:22:46 compute-0 ovn_controller[95452]: 2025-12-10T10:22:46Z|00064|binding|INFO|Releasing lport 7972de09-2cb7-4533-92bc-c9d1cd414652 from this chassis (sb_readonly=0)
Dec 10 10:22:46 compute-0 nova_compute[186989]: 2025-12-10 10:22:46.568 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:47 compute-0 podman[214669]: 2025-12-10 10:22:47.013072312 +0000 UTC m=+0.055056057 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.229 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.726 186993 DEBUG nova.compute.manager [req-278940e2-19d9-4345-bdf1-f11525e4db89 req-2049a3d5-e87d-4eef-836d-816b6be2628f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-changed-3a1b6a7a-e8f7-421a-af57-fb303d77f486 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.726 186993 DEBUG nova.compute.manager [req-278940e2-19d9-4345-bdf1-f11525e4db89 req-2049a3d5-e87d-4eef-836d-816b6be2628f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Refreshing instance network info cache due to event network-changed-3a1b6a7a-e8f7-421a-af57-fb303d77f486. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.726 186993 DEBUG oslo_concurrency.lockutils [req-278940e2-19d9-4345-bdf1-f11525e4db89 req-2049a3d5-e87d-4eef-836d-816b6be2628f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.726 186993 DEBUG oslo_concurrency.lockutils [req-278940e2-19d9-4345-bdf1-f11525e4db89 req-2049a3d5-e87d-4eef-836d-816b6be2628f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.726 186993 DEBUG nova.network.neutron [req-278940e2-19d9-4345-bdf1-f11525e4db89 req-2049a3d5-e87d-4eef-836d-816b6be2628f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Refreshing network info cache for port 3a1b6a7a-e8f7-421a-af57-fb303d77f486 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.808 186993 DEBUG oslo_concurrency.lockutils [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "06155ade-0041-467e-92e9-2fad99467514" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.809 186993 DEBUG oslo_concurrency.lockutils [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.809 186993 DEBUG oslo_concurrency.lockutils [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "06155ade-0041-467e-92e9-2fad99467514-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.810 186993 DEBUG oslo_concurrency.lockutils [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.810 186993 DEBUG oslo_concurrency.lockutils [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.813 186993 INFO nova.compute.manager [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Terminating instance
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.815 186993 DEBUG nova.compute.manager [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:22:47 compute-0 kernel: tap3a1b6a7a-e8 (unregistering): left promiscuous mode
Dec 10 10:22:47 compute-0 NetworkManager[55541]: <info>  [1765362167.8401] device (tap3a1b6a7a-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:22:47 compute-0 ovn_controller[95452]: 2025-12-10T10:22:47Z|00065|binding|INFO|Releasing lport 3a1b6a7a-e8f7-421a-af57-fb303d77f486 from this chassis (sb_readonly=0)
Dec 10 10:22:47 compute-0 ovn_controller[95452]: 2025-12-10T10:22:47Z|00066|binding|INFO|Setting lport 3a1b6a7a-e8f7-421a-af57-fb303d77f486 down in Southbound
Dec 10 10:22:47 compute-0 ovn_controller[95452]: 2025-12-10T10:22:47Z|00067|binding|INFO|Removing iface tap3a1b6a7a-e8 ovn-installed in OVS
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.843 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.845 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:47 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:47.850 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:a7:15 10.100.0.11'], port_security=['fa:16:3e:de:a7:15 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '06155ade-0041-467e-92e9-2fad99467514', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99e953a5-acb0-4f92-a7d6-2af75bab0205', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fc47485a-aa98-4835-9458-700243dad059', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28db7515-464c-4bb8-b217-5d6a4c627744, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=3a1b6a7a-e8f7-421a-af57-fb303d77f486) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:22:47 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:47.851 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 3a1b6a7a-e8f7-421a-af57-fb303d77f486 in datapath 99e953a5-acb0-4f92-a7d6-2af75bab0205 unbound from our chassis
Dec 10 10:22:47 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:47.852 104302 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 99e953a5-acb0-4f92-a7d6-2af75bab0205, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 10 10:22:47 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:47.853 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6020a6-3588-46b6-ba51-090de91e5350]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:47 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:47.853 104302 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205 namespace which is not needed anymore
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.859 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:47 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:22:47 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 13.782s CPU time.
Dec 10 10:22:47 compute-0 systemd-machined[153379]: Machine qemu-3-instance-00000003 terminated.
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.951 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 06155ade-0041-467e-92e9-2fad99467514] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.951 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.952 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.953 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:22:47 compute-0 nova_compute[186989]: 2025-12-10 10:22:47.953 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:22:48 compute-0 neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205[214284]: [NOTICE]   (214288) : haproxy version is 2.8.14-c23fe91
Dec 10 10:22:48 compute-0 neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205[214284]: [NOTICE]   (214288) : path to executable is /usr/sbin/haproxy
Dec 10 10:22:48 compute-0 neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205[214284]: [WARNING]  (214288) : Exiting Master process...
Dec 10 10:22:48 compute-0 neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205[214284]: [ALERT]    (214288) : Current worker (214290) exited with code 143 (Terminated)
Dec 10 10:22:48 compute-0 neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205[214284]: [WARNING]  (214288) : All workers exited. Exiting... (0)
Dec 10 10:22:48 compute-0 systemd[1]: libpod-57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6.scope: Deactivated successfully.
Dec 10 10:22:48 compute-0 podman[214712]: 2025-12-10 10:22:48.027392826 +0000 UTC m=+0.088162471 container died 57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.041 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.047 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6-userdata-shm.mount: Deactivated successfully.
Dec 10 10:22:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-ac9b9c0d854219fc97ccc4a64ed907868feb88db9b799730b8aeebb446e1008c-merged.mount: Deactivated successfully.
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.060 186993 DEBUG nova.compute.manager [req-7a88fe07-b36c-4651-98ec-1690b46bbaca req-bb9825c1-8496-4636-9486-c505c056b2f5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-vif-unplugged-3a1b6a7a-e8f7-421a-af57-fb303d77f486 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.060 186993 DEBUG oslo_concurrency.lockutils [req-7a88fe07-b36c-4651-98ec-1690b46bbaca req-bb9825c1-8496-4636-9486-c505c056b2f5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "06155ade-0041-467e-92e9-2fad99467514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.060 186993 DEBUG oslo_concurrency.lockutils [req-7a88fe07-b36c-4651-98ec-1690b46bbaca req-bb9825c1-8496-4636-9486-c505c056b2f5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.061 186993 DEBUG oslo_concurrency.lockutils [req-7a88fe07-b36c-4651-98ec-1690b46bbaca req-bb9825c1-8496-4636-9486-c505c056b2f5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.061 186993 DEBUG nova.compute.manager [req-7a88fe07-b36c-4651-98ec-1690b46bbaca req-bb9825c1-8496-4636-9486-c505c056b2f5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] No waiting events found dispatching network-vif-unplugged-3a1b6a7a-e8f7-421a-af57-fb303d77f486 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.062 186993 DEBUG nova.compute.manager [req-7a88fe07-b36c-4651-98ec-1690b46bbaca req-bb9825c1-8496-4636-9486-c505c056b2f5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-vif-unplugged-3a1b6a7a-e8f7-421a-af57-fb303d77f486 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.089 186993 INFO nova.virt.libvirt.driver [-] [instance: 06155ade-0041-467e-92e9-2fad99467514] Instance destroyed successfully.
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.089 186993 DEBUG nova.objects.instance [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid 06155ade-0041-467e-92e9-2fad99467514 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.104 186993 DEBUG nova.virt.libvirt.vif [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:22:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-553117970',display_name='tempest-TestNetworkBasicOps-server-553117970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-553117970',id=3,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM9rulyR4iCiNffHiCIPsrHfxs4xEeqz3cOIfq/A+G1OidINlr8tKtpavNRs8X9mbGWMw4RRhgy5RN/1b5AV2X87wr9L+R9c+gEFdchYhXKlmXq1eyUtEqakYSehGvCdJg==',key_name='tempest-TestNetworkBasicOps-514887772',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:22:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-o3tuxyzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:22:10Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=06155ade-0041-467e-92e9-2fad99467514,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.105 186993 DEBUG nova.network.os_vif_util [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.105 186993 DEBUG nova.network.os_vif_util [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:de:a7:15,bridge_name='br-int',has_traffic_filtering=True,id=3a1b6a7a-e8f7-421a-af57-fb303d77f486,network=Network(99e953a5-acb0-4f92-a7d6-2af75bab0205),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a1b6a7a-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.105 186993 DEBUG os_vif [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:a7:15,bridge_name='br-int',has_traffic_filtering=True,id=3a1b6a7a-e8f7-421a-af57-fb303d77f486,network=Network(99e953a5-acb0-4f92-a7d6-2af75bab0205),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a1b6a7a-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.106 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.107 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a1b6a7a-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.150 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.154 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.157 186993 INFO os_vif [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:de:a7:15,bridge_name='br-int',has_traffic_filtering=True,id=3a1b6a7a-e8f7-421a-af57-fb303d77f486,network=Network(99e953a5-acb0-4f92-a7d6-2af75bab0205),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a1b6a7a-e8')
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.157 186993 INFO nova.virt.libvirt.driver [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Deleting instance files /var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514_del
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.159 186993 INFO nova.virt.libvirt.driver [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Deletion of /var/lib/nova/instances/06155ade-0041-467e-92e9-2fad99467514_del complete
Dec 10 10:22:48 compute-0 podman[214712]: 2025-12-10 10:22:48.201323744 +0000 UTC m=+0.262093389 container cleanup 57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 10 10:22:48 compute-0 systemd[1]: libpod-conmon-57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6.scope: Deactivated successfully.
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.222 186993 INFO nova.compute.manager [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Took 0.41 seconds to destroy the instance on the hypervisor.
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.223 186993 DEBUG oslo.service.loopingcall [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.223 186993 DEBUG nova.compute.manager [-] [instance: 06155ade-0041-467e-92e9-2fad99467514] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.224 186993 DEBUG nova.network.neutron [-] [instance: 06155ade-0041-467e-92e9-2fad99467514] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:22:48 compute-0 podman[214756]: 2025-12-10 10:22:48.288239682 +0000 UTC m=+0.053164056 container remove 57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 10 10:22:48 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:48.296 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[cfdc2a78-1e39-4c5b-8a68-19b9332b0aee]: (4, ('Wed Dec 10 10:22:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205 (57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6)\n57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6\nWed Dec 10 10:22:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205 (57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6)\n57b6021f171914e8c6fd8aafc3f091ed2782bfcb9a2670e482f31c7978400dc6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:48 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:48.299 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f0538f85-40f9-40cd-98d2-72e4e21314bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:48 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:48.300 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99e953a5-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.301 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:48 compute-0 kernel: tap99e953a5-a0: left promiscuous mode
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.312 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:48 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:48.317 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[4f7b8b41-06e6-487d-baef-7aa28ca18a9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:48 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:48.332 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[553fd446-4724-434f-951e-0727d6b0806f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:48 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:48.333 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[6e77e083-43d5-44f6-8ecf-527d543818a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:48 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:48.350 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[db80f3cf-2795-4b48-8f61-e3c7fde58653]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 314485, 'reachable_time': 21905, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214771, 'error': None, 'target': 'ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:48 compute-0 systemd[1]: run-netns-ovnmeta\x2d99e953a5\x2dacb0\x2d4f92\x2da7d6\x2d2af75bab0205.mount: Deactivated successfully.
Dec 10 10:22:48 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:48.354 104414 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-99e953a5-acb0-4f92-a7d6-2af75bab0205 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 10 10:22:48 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:22:48.355 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[30f651c8-5bb0-466f-832f-768cdefbe1be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:22:48 compute-0 nova_compute[186989]: 2025-12-10 10:22:48.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.406 186993 DEBUG nova.network.neutron [req-278940e2-19d9-4345-bdf1-f11525e4db89 req-2049a3d5-e87d-4eef-836d-816b6be2628f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updated VIF entry in instance network info cache for port 3a1b6a7a-e8f7-421a-af57-fb303d77f486. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.406 186993 DEBUG nova.network.neutron [req-278940e2-19d9-4345-bdf1-f11525e4db89 req-2049a3d5-e87d-4eef-836d-816b6be2628f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updating instance_info_cache with network_info: [{"id": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "address": "fa:16:3e:de:a7:15", "network": {"id": "99e953a5-acb0-4f92-a7d6-2af75bab0205", "bridge": "br-int", "label": "tempest-network-smoke--179672412", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a1b6a7a-e8", "ovs_interfaceid": "3a1b6a7a-e8f7-421a-af57-fb303d77f486", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.415 186993 DEBUG nova.network.neutron [-] [instance: 06155ade-0041-467e-92e9-2fad99467514] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.524 186993 DEBUG oslo_concurrency.lockutils [req-278940e2-19d9-4345-bdf1-f11525e4db89 req-2049a3d5-e87d-4eef-836d-816b6be2628f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-06155ade-0041-467e-92e9-2fad99467514" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.526 186993 INFO nova.compute.manager [-] [instance: 06155ade-0041-467e-92e9-2fad99467514] Took 1.30 seconds to deallocate network for instance.
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.565 186993 DEBUG oslo_concurrency.lockutils [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.566 186993 DEBUG oslo_concurrency.lockutils [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.631 186993 DEBUG nova.compute.provider_tree [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.657 186993 DEBUG nova.scheduler.client.report [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.686 186993 DEBUG oslo_concurrency.lockutils [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.707 186993 INFO nova.scheduler.client.report [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance 06155ade-0041-467e-92e9-2fad99467514
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.764 186993 DEBUG oslo_concurrency.lockutils [None req-4b44c1bf-9fd4-4476-9e42-4e38a359c3d0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.955s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.801 186993 DEBUG nova.compute.manager [req-5fdaf718-1c3e-4ac2-8a30-cbc013430008 req-8c56fa6d-8f0a-458a-859e-5159c3966615 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-vif-deleted-3a1b6a7a-e8f7-421a-af57-fb303d77f486 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.916 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.946 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.946 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.946 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:49 compute-0 nova_compute[186989]: 2025-12-10 10:22:49.946 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.104 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.105 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5744MB free_disk=73.32989883422852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.105 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.106 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.127 186993 DEBUG nova.compute.manager [req-ff7ad4ed-637b-46dc-953d-366070e2402f req-a5b44186-6dc8-4c2d-9c61-7f41056c3836 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received event network-vif-plugged-3a1b6a7a-e8f7-421a-af57-fb303d77f486 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.128 186993 DEBUG oslo_concurrency.lockutils [req-ff7ad4ed-637b-46dc-953d-366070e2402f req-a5b44186-6dc8-4c2d-9c61-7f41056c3836 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "06155ade-0041-467e-92e9-2fad99467514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.128 186993 DEBUG oslo_concurrency.lockutils [req-ff7ad4ed-637b-46dc-953d-366070e2402f req-a5b44186-6dc8-4c2d-9c61-7f41056c3836 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.128 186993 DEBUG oslo_concurrency.lockutils [req-ff7ad4ed-637b-46dc-953d-366070e2402f req-a5b44186-6dc8-4c2d-9c61-7f41056c3836 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "06155ade-0041-467e-92e9-2fad99467514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.128 186993 DEBUG nova.compute.manager [req-ff7ad4ed-637b-46dc-953d-366070e2402f req-a5b44186-6dc8-4c2d-9c61-7f41056c3836 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] No waiting events found dispatching network-vif-plugged-3a1b6a7a-e8f7-421a-af57-fb303d77f486 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.129 186993 WARNING nova.compute.manager [req-ff7ad4ed-637b-46dc-953d-366070e2402f req-a5b44186-6dc8-4c2d-9c61-7f41056c3836 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 06155ade-0041-467e-92e9-2fad99467514] Received unexpected event network-vif-plugged-3a1b6a7a-e8f7-421a-af57-fb303d77f486 for instance with vm_state deleted and task_state None.
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.156 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.156 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.176 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.190 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.212 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:22:50 compute-0 nova_compute[186989]: 2025-12-10 10:22:50.212 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:22:52 compute-0 nova_compute[186989]: 2025-12-10 10:22:52.233 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:53 compute-0 podman[214774]: 2025-12-10 10:22:53.02579468 +0000 UTC m=+0.064236038 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:22:53 compute-0 podman[214775]: 2025-12-10 10:22:53.030500539 +0000 UTC m=+0.065361189 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 10 10:22:53 compute-0 podman[214776]: 2025-12-10 10:22:53.059862512 +0000 UTC m=+0.092907643 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 10 10:22:53 compute-0 nova_compute[186989]: 2025-12-10 10:22:53.150 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:54 compute-0 nova_compute[186989]: 2025-12-10 10:22:54.007 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:54 compute-0 nova_compute[186989]: 2025-12-10 10:22:54.094 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:57 compute-0 nova_compute[186989]: 2025-12-10 10:22:57.235 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:22:58 compute-0 nova_compute[186989]: 2025-12-10 10:22:58.154 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:02 compute-0 podman[214836]: 2025-12-10 10:23:02.031932236 +0000 UTC m=+0.075062744 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 10 10:23:02 compute-0 nova_compute[186989]: 2025-12-10 10:23:02.236 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:03 compute-0 nova_compute[186989]: 2025-12-10 10:23:03.089 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362168.0869174, 06155ade-0041-467e-92e9-2fad99467514 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:23:03 compute-0 nova_compute[186989]: 2025-12-10 10:23:03.091 186993 INFO nova.compute.manager [-] [instance: 06155ade-0041-467e-92e9-2fad99467514] VM Stopped (Lifecycle Event)
Dec 10 10:23:03 compute-0 nova_compute[186989]: 2025-12-10 10:23:03.116 186993 DEBUG nova.compute.manager [None req-94f23b64-d87c-4493-9ce6-808a5390de44 - - - - - -] [instance: 06155ade-0041-467e-92e9-2fad99467514] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:23:03 compute-0 nova_compute[186989]: 2025-12-10 10:23:03.156 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:05 compute-0 podman[214858]: 2025-12-10 10:23:05.076838484 +0000 UTC m=+0.110069891 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:23:07 compute-0 nova_compute[186989]: 2025-12-10 10:23:07.238 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:07 compute-0 nova_compute[186989]: 2025-12-10 10:23:07.669 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "247bc6b9-be00-4776-a195-f7a413953734" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:07 compute-0 nova_compute[186989]: 2025-12-10 10:23:07.670 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:07 compute-0 nova_compute[186989]: 2025-12-10 10:23:07.690 186993 DEBUG nova.compute.manager [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:23:07 compute-0 nova_compute[186989]: 2025-12-10 10:23:07.780 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:07 compute-0 nova_compute[186989]: 2025-12-10 10:23:07.781 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:07 compute-0 nova_compute[186989]: 2025-12-10 10:23:07.791 186993 DEBUG nova.virt.hardware [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:23:07 compute-0 nova_compute[186989]: 2025-12-10 10:23:07.791 186993 INFO nova.compute.claims [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:23:07 compute-0 nova_compute[186989]: 2025-12-10 10:23:07.911 186993 DEBUG nova.compute.provider_tree [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:23:07 compute-0 nova_compute[186989]: 2025-12-10 10:23:07.933 186993 DEBUG nova.scheduler.client.report [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:23:07 compute-0 nova_compute[186989]: 2025-12-10 10:23:07.958 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:07 compute-0 nova_compute[186989]: 2025-12-10 10:23:07.959 186993 DEBUG nova.compute.manager [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.015 186993 DEBUG nova.compute.manager [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.016 186993 DEBUG nova.network.neutron [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.037 186993 INFO nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.063 186993 DEBUG nova.compute.manager [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.150 186993 DEBUG nova.compute.manager [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.152 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.152 186993 INFO nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Creating image(s)
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.152 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.153 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.153 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.165 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.167 186993 DEBUG oslo_concurrency.processutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.255 186993 DEBUG oslo_concurrency.processutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.256 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.257 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.268 186993 DEBUG oslo_concurrency.processutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.333 186993 DEBUG oslo_concurrency.processutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.335 186993 DEBUG oslo_concurrency.processutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.407 186993 DEBUG nova.policy [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.510 186993 DEBUG oslo_concurrency.processutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk 1073741824" returned: 0 in 0.175s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.511 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.511 186993 DEBUG oslo_concurrency.processutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.569 186993 DEBUG oslo_concurrency.processutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.571 186993 DEBUG nova.virt.disk.api [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.571 186993 DEBUG oslo_concurrency.processutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.629 186993 DEBUG oslo_concurrency.processutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.631 186993 DEBUG nova.virt.disk.api [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.631 186993 DEBUG nova.objects.instance [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid 247bc6b9-be00-4776-a195-f7a413953734 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.654 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.654 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Ensure instance console log exists: /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.655 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.655 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:08 compute-0 nova_compute[186989]: 2025-12-10 10:23:08.656 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:09 compute-0 nova_compute[186989]: 2025-12-10 10:23:09.503 186993 DEBUG nova.network.neutron [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Successfully created port: 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:23:10 compute-0 nova_compute[186989]: 2025-12-10 10:23:10.655 186993 DEBUG nova.network.neutron [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Successfully updated port: 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:23:10 compute-0 nova_compute[186989]: 2025-12-10 10:23:10.677 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:23:10 compute-0 nova_compute[186989]: 2025-12-10 10:23:10.677 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:23:10 compute-0 nova_compute[186989]: 2025-12-10 10:23:10.677 186993 DEBUG nova.network.neutron [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:23:10 compute-0 nova_compute[186989]: 2025-12-10 10:23:10.752 186993 DEBUG nova.compute.manager [req-fc4e14a7-2097-490a-80b9-fadb9cb3d031 req-de467b22-8aaa-41e6-914d-17094bf36e9a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Received event network-changed-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:23:10 compute-0 nova_compute[186989]: 2025-12-10 10:23:10.753 186993 DEBUG nova.compute.manager [req-fc4e14a7-2097-490a-80b9-fadb9cb3d031 req-de467b22-8aaa-41e6-914d-17094bf36e9a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Refreshing instance network info cache due to event network-changed-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:23:10 compute-0 nova_compute[186989]: 2025-12-10 10:23:10.753 186993 DEBUG oslo_concurrency.lockutils [req-fc4e14a7-2097-490a-80b9-fadb9cb3d031 req-de467b22-8aaa-41e6-914d-17094bf36e9a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:23:10 compute-0 nova_compute[186989]: 2025-12-10 10:23:10.821 186993 DEBUG nova.network.neutron [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.836 186993 DEBUG nova.network.neutron [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Updating instance_info_cache with network_info: [{"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.864 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.865 186993 DEBUG nova.compute.manager [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Instance network_info: |[{"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.865 186993 DEBUG oslo_concurrency.lockutils [req-fc4e14a7-2097-490a-80b9-fadb9cb3d031 req-de467b22-8aaa-41e6-914d-17094bf36e9a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.865 186993 DEBUG nova.network.neutron [req-fc4e14a7-2097-490a-80b9-fadb9cb3d031 req-de467b22-8aaa-41e6-914d-17094bf36e9a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Refreshing network info cache for port 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.868 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Start _get_guest_xml network_info=[{"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.872 186993 WARNING nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.878 186993 DEBUG nova.virt.libvirt.host [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.878 186993 DEBUG nova.virt.libvirt.host [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.881 186993 DEBUG nova.virt.libvirt.host [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.882 186993 DEBUG nova.virt.libvirt.host [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.882 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.883 186993 DEBUG nova.virt.hardware [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.883 186993 DEBUG nova.virt.hardware [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.883 186993 DEBUG nova.virt.hardware [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.883 186993 DEBUG nova.virt.hardware [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.883 186993 DEBUG nova.virt.hardware [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.884 186993 DEBUG nova.virt.hardware [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.884 186993 DEBUG nova.virt.hardware [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.884 186993 DEBUG nova.virt.hardware [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.884 186993 DEBUG nova.virt.hardware [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.884 186993 DEBUG nova.virt.hardware [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.885 186993 DEBUG nova.virt.hardware [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.888 186993 DEBUG nova.virt.libvirt.vif [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1661384698',display_name='tempest-TestNetworkBasicOps-server-1661384698',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1661384698',id=4,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOoM2wxE/oQobBkwJW25+OIMBISiWlzYSSD+B1Ou1SGSJf8q1c5yTWdQRHu9jb1S94waK2J9TkgkUN7gAAqPQ0ctAKoNCthN3t+SBmA2139FZclCZYcu0m9TIbhiQgf1EA==',key_name='tempest-TestNetworkBasicOps-478102849',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-wy1scrod',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:23:08Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=247bc6b9-be00-4776-a195-f7a413953734,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.888 186993 DEBUG nova.network.os_vif_util [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.889 186993 DEBUG nova.network.os_vif_util [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:14:09,bridge_name='br-int',has_traffic_filtering=True,id=43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a5ac9c-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.890 186993 DEBUG nova.objects.instance [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid 247bc6b9-be00-4776-a195-f7a413953734 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.913 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:23:11 compute-0 nova_compute[186989]:   <uuid>247bc6b9-be00-4776-a195-f7a413953734</uuid>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   <name>instance-00000004</name>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-1661384698</nova:name>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:23:11</nova:creationTime>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:23:11 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:23:11 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:23:11 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:23:11 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:23:11 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:23:11 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:23:11 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:23:11 compute-0 nova_compute[186989]:         <nova:port uuid="43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5">
Dec 10 10:23:11 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <system>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <entry name="serial">247bc6b9-be00-4776-a195-f7a413953734</entry>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <entry name="uuid">247bc6b9-be00-4776-a195-f7a413953734</entry>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     </system>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   <os>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   </os>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   <features>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   </features>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk.config"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:c1:14:09"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <target dev="tap43a5ac9c-0f"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/console.log" append="off"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <video>
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     </video>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:23:11 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:23:11 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:23:11 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:23:11 compute-0 nova_compute[186989]: </domain>
Dec 10 10:23:11 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.915 186993 DEBUG nova.compute.manager [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Preparing to wait for external event network-vif-plugged-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.915 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "247bc6b9-be00-4776-a195-f7a413953734-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.915 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.915 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.916 186993 DEBUG nova.virt.libvirt.vif [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1661384698',display_name='tempest-TestNetworkBasicOps-server-1661384698',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1661384698',id=4,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOoM2wxE/oQobBkwJW25+OIMBISiWlzYSSD+B1Ou1SGSJf8q1c5yTWdQRHu9jb1S94waK2J9TkgkUN7gAAqPQ0ctAKoNCthN3t+SBmA2139FZclCZYcu0m9TIbhiQgf1EA==',key_name='tempest-TestNetworkBasicOps-478102849',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-wy1scrod',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:23:08Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=247bc6b9-be00-4776-a195-f7a413953734,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.916 186993 DEBUG nova.network.os_vif_util [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.917 186993 DEBUG nova.network.os_vif_util [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:14:09,bridge_name='br-int',has_traffic_filtering=True,id=43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a5ac9c-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.917 186993 DEBUG os_vif [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:14:09,bridge_name='br-int',has_traffic_filtering=True,id=43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a5ac9c-0f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.918 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.918 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.919 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.921 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.922 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43a5ac9c-0f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.922 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap43a5ac9c-0f, col_values=(('external_ids', {'iface-id': '43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:14:09', 'vm-uuid': '247bc6b9-be00-4776-a195-f7a413953734'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.923 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:11 compute-0 NetworkManager[55541]: <info>  [1765362191.9244] manager: (tap43a5ac9c-0f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.926 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.934 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.935 186993 INFO os_vif [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:14:09,bridge_name='br-int',has_traffic_filtering=True,id=43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a5ac9c-0f')
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.991 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.992 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.992 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:c1:14:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:23:11 compute-0 nova_compute[186989]: 2025-12-10 10:23:11.993 186993 INFO nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Using config drive
Dec 10 10:23:12 compute-0 nova_compute[186989]: 2025-12-10 10:23:12.273 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:12 compute-0 nova_compute[186989]: 2025-12-10 10:23:12.554 186993 INFO nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Creating config drive at /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk.config
Dec 10 10:23:12 compute-0 nova_compute[186989]: 2025-12-10 10:23:12.560 186993 DEBUG oslo_concurrency.processutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5jhta__x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:12 compute-0 nova_compute[186989]: 2025-12-10 10:23:12.683 186993 DEBUG oslo_concurrency.processutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5jhta__x" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:12 compute-0 kernel: tap43a5ac9c-0f: entered promiscuous mode
Dec 10 10:23:12 compute-0 ovn_controller[95452]: 2025-12-10T10:23:12Z|00068|binding|INFO|Claiming lport 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 for this chassis.
Dec 10 10:23:12 compute-0 NetworkManager[55541]: <info>  [1765362192.7481] manager: (tap43a5ac9c-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Dec 10 10:23:12 compute-0 ovn_controller[95452]: 2025-12-10T10:23:12Z|00069|binding|INFO|43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5: Claiming fa:16:3e:c1:14:09 10.100.0.11
Dec 10 10:23:12 compute-0 nova_compute[186989]: 2025-12-10 10:23:12.749 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.762 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:14:09 10.100.0.11'], port_security=['fa:16:3e:c1:14:09 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '247bc6b9-be00-4776-a195-f7a413953734', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3b98a055-b2ca-444f-b7fe-e5e3733253cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f15ac783-5514-42e3-9054-2f4e6c3b1dab, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.764 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 in datapath bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1 bound to our chassis
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.766 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.782 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[58ca81b0-2e78-4800-8dfd-e214b1e73767]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.783 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbbc6d33e-c1 in ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.785 213247 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbbc6d33e-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.785 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[ad42eff4-f98f-4526-92d1-b686cdd131be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.786 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[baa640ed-c885-403d-8f4a-fc6134142745]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 systemd-udevd[214916]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.799 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[18841617-b72f-4ef8-91ef-a40b179fd6df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 systemd-machined[153379]: New machine qemu-4-instance-00000004.
Dec 10 10:23:12 compute-0 NetworkManager[55541]: <info>  [1765362192.8030] device (tap43a5ac9c-0f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:23:12 compute-0 NetworkManager[55541]: <info>  [1765362192.8041] device (tap43a5ac9c-0f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:23:12 compute-0 nova_compute[186989]: 2025-12-10 10:23:12.804 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:12 compute-0 ovn_controller[95452]: 2025-12-10T10:23:12Z|00070|binding|INFO|Setting lport 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 ovn-installed in OVS
Dec 10 10:23:12 compute-0 ovn_controller[95452]: 2025-12-10T10:23:12Z|00071|binding|INFO|Setting lport 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 up in Southbound
Dec 10 10:23:12 compute-0 nova_compute[186989]: 2025-12-10 10:23:12.810 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.813 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[0e2c02e0-56dd-4f2a-8b2c-96a92663fbfd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.846 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[ca226040-1986-40f7-8112-655c08b497e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.850 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[93a92f5b-4f26-4e0c-abf3-81fbff7eefed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 NetworkManager[55541]: <info>  [1765362192.8518] manager: (tapbbc6d33e-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/44)
Dec 10 10:23:12 compute-0 systemd-udevd[214920]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.882 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[d13c3135-b8dc-4a6a-9cec-210f49ced300]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.886 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[a8bd1ce8-8588-4b06-9bbb-ecf2b2b8e9bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 NetworkManager[55541]: <info>  [1765362192.9085] device (tapbbc6d33e-c0): carrier: link connected
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.912 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[29677181-e7fc-4ae4-a77e-39227b842499]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.928 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5c63f264-ec78-4122-8a27-b570b0f2665f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbbc6d33e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:db:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 320812, 'reachable_time': 40487, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214949, 'error': None, 'target': 'ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.940 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[9875221a-426b-48fd-9a54-561551d22849]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6a:db1f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 320812, 'tstamp': 320812}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214950, 'error': None, 'target': 'ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.954 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[e3315ea0-0098-41ae-ba2e-30873d388a7e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbbc6d33e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:db:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 320812, 'reachable_time': 40487, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214951, 'error': None, 'target': 'ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:12.980 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f56a6d22-193d-4fa4-9d7c-2c1cba950cdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.000 186993 DEBUG nova.compute.manager [req-06266a13-23c9-461a-a985-d568ffc016a5 req-803e642b-15de-4adc-9137-79d9e7138922 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Received event network-vif-plugged-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.001 186993 DEBUG oslo_concurrency.lockutils [req-06266a13-23c9-461a-a985-d568ffc016a5 req-803e642b-15de-4adc-9137-79d9e7138922 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "247bc6b9-be00-4776-a195-f7a413953734-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.001 186993 DEBUG oslo_concurrency.lockutils [req-06266a13-23c9-461a-a985-d568ffc016a5 req-803e642b-15de-4adc-9137-79d9e7138922 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.001 186993 DEBUG oslo_concurrency.lockutils [req-06266a13-23c9-461a-a985-d568ffc016a5 req-803e642b-15de-4adc-9137-79d9e7138922 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.001 186993 DEBUG nova.compute.manager [req-06266a13-23c9-461a-a985-d568ffc016a5 req-803e642b-15de-4adc-9137-79d9e7138922 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Processing event network-vif-plugged-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:13.037 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[97367021-12ab-40d0-ba25-a17679852d3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:13.038 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbbc6d33e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:13.038 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:13.039 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbbc6d33e-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.041 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:13 compute-0 NetworkManager[55541]: <info>  [1765362193.0430] manager: (tapbbc6d33e-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Dec 10 10:23:13 compute-0 kernel: tapbbc6d33e-c0: entered promiscuous mode
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.045 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:13.046 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbbc6d33e-c0, col_values=(('external_ids', {'iface-id': 'e91b42bc-2e07-4f57-b591-e1950e4a7c0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.047 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:13 compute-0 ovn_controller[95452]: 2025-12-10T10:23:13Z|00072|binding|INFO|Releasing lport e91b42bc-2e07-4f57-b591-e1950e4a7c0d from this chassis (sb_readonly=0)
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.071 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.073 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:13.074 104302 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:13.075 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[3f42eb2f-c846-4499-ae9c-1cfd8b09c757]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:13.076 104302 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: global
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     log         /dev/log local0 debug
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     log-tag     haproxy-metadata-proxy-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     user        root
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     group       root
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     maxconn     1024
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     pidfile     /var/lib/neutron/external/pids/bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1.pid.haproxy
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     daemon
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: defaults
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     log global
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     mode http
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     option httplog
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     option dontlognull
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     option http-server-close
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     option forwardfor
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     retries                 3
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     timeout http-request    30s
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     timeout connect         30s
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     timeout client          32s
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     timeout server          32s
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     timeout http-keep-alive 30s
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: listen listener
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     bind 169.254.169.254:80
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     server metadata /var/lib/neutron/metadata_proxy
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:     http-request add-header X-OVN-Network-ID bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 10 10:23:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:13.078 104302 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'env', 'PROCESS_TAG=haproxy-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.462 186993 DEBUG nova.network.neutron [req-fc4e14a7-2097-490a-80b9-fadb9cb3d031 req-de467b22-8aaa-41e6-914d-17094bf36e9a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Updated VIF entry in instance network info cache for port 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.464 186993 DEBUG nova.network.neutron [req-fc4e14a7-2097-490a-80b9-fadb9cb3d031 req-de467b22-8aaa-41e6-914d-17094bf36e9a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Updating instance_info_cache with network_info: [{"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.482 186993 DEBUG oslo_concurrency.lockutils [req-fc4e14a7-2097-490a-80b9-fadb9cb3d031 req-de467b22-8aaa-41e6-914d-17094bf36e9a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:23:13 compute-0 podman[214983]: 2025-12-10 10:23:13.590255546 +0000 UTC m=+0.085484169 container create 6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Dec 10 10:23:13 compute-0 podman[214983]: 2025-12-10 10:23:13.542097938 +0000 UTC m=+0.037326641 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:23:13 compute-0 systemd[1]: Started libpod-conmon-6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e.scope.
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.650 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362193.6497948, 247bc6b9-be00-4776-a195-f7a413953734 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.651 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] VM Started (Lifecycle Event)
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.654 186993 DEBUG nova.compute.manager [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.658 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.662 186993 INFO nova.virt.libvirt.driver [-] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Instance spawned successfully.
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.662 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.668 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:23:13 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.672 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:23:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47a785f88109bd706b3c3adc2f68ca527aa683bd8a9e25d2f209b7cc3d160b40/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.685 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.686 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.686 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.687 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.687 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.688 186993 DEBUG nova.virt.libvirt.driver [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:23:13 compute-0 podman[214983]: 2025-12-10 10:23:13.689216252 +0000 UTC m=+0.184444906 container init 6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.693 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.694 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362193.651318, 247bc6b9-be00-4776-a195-f7a413953734 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.694 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] VM Paused (Lifecycle Event)
Dec 10 10:23:13 compute-0 podman[214983]: 2025-12-10 10:23:13.695486225 +0000 UTC m=+0.190714848 container start 6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 10 10:23:13 compute-0 neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1[215005]: [NOTICE]   (215009) : New worker (215011) forked
Dec 10 10:23:13 compute-0 neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1[215005]: [NOTICE]   (215009) : Loading success.
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.725 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.730 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362193.656836, 247bc6b9-be00-4776-a195-f7a413953734 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.730 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] VM Resumed (Lifecycle Event)
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.748 186993 INFO nova.compute.manager [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Took 5.60 seconds to spawn the instance on the hypervisor.
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.749 186993 DEBUG nova.compute.manager [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.751 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.758 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.789 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.805 186993 INFO nova.compute.manager [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Took 6.06 seconds to build instance.
Dec 10 10:23:13 compute-0 nova_compute[186989]: 2025-12-10 10:23:13.819 186993 DEBUG oslo_concurrency.lockutils [None req-1b487c3e-0d49-4b48-82ff-c7a400e2331c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:15 compute-0 nova_compute[186989]: 2025-12-10 10:23:15.142 186993 DEBUG nova.compute.manager [req-f8b25d74-dcf1-4c97-9625-4a247dde4254 req-3459a407-9721-415f-b57a-98c37c8bfa6f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Received event network-vif-plugged-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:23:15 compute-0 nova_compute[186989]: 2025-12-10 10:23:15.144 186993 DEBUG oslo_concurrency.lockutils [req-f8b25d74-dcf1-4c97-9625-4a247dde4254 req-3459a407-9721-415f-b57a-98c37c8bfa6f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "247bc6b9-be00-4776-a195-f7a413953734-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:15 compute-0 nova_compute[186989]: 2025-12-10 10:23:15.144 186993 DEBUG oslo_concurrency.lockutils [req-f8b25d74-dcf1-4c97-9625-4a247dde4254 req-3459a407-9721-415f-b57a-98c37c8bfa6f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:15 compute-0 nova_compute[186989]: 2025-12-10 10:23:15.145 186993 DEBUG oslo_concurrency.lockutils [req-f8b25d74-dcf1-4c97-9625-4a247dde4254 req-3459a407-9721-415f-b57a-98c37c8bfa6f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:15 compute-0 nova_compute[186989]: 2025-12-10 10:23:15.146 186993 DEBUG nova.compute.manager [req-f8b25d74-dcf1-4c97-9625-4a247dde4254 req-3459a407-9721-415f-b57a-98c37c8bfa6f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] No waiting events found dispatching network-vif-plugged-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:23:15 compute-0 nova_compute[186989]: 2025-12-10 10:23:15.146 186993 WARNING nova.compute.manager [req-f8b25d74-dcf1-4c97-9625-4a247dde4254 req-3459a407-9721-415f-b57a-98c37c8bfa6f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Received unexpected event network-vif-plugged-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 for instance with vm_state active and task_state None.
Dec 10 10:23:16 compute-0 podman[215020]: 2025-12-10 10:23:16.06394983 +0000 UTC m=+0.072847634 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 10 10:23:16 compute-0 nova_compute[186989]: 2025-12-10 10:23:16.926 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:17 compute-0 nova_compute[186989]: 2025-12-10 10:23:17.277 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:17 compute-0 nova_compute[186989]: 2025-12-10 10:23:17.729 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:17 compute-0 NetworkManager[55541]: <info>  [1765362197.7308] manager: (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Dec 10 10:23:17 compute-0 NetworkManager[55541]: <info>  [1765362197.7329] manager: (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Dec 10 10:23:17 compute-0 ovn_controller[95452]: 2025-12-10T10:23:17Z|00073|binding|INFO|Releasing lport e91b42bc-2e07-4f57-b591-e1950e4a7c0d from this chassis (sb_readonly=0)
Dec 10 10:23:17 compute-0 nova_compute[186989]: 2025-12-10 10:23:17.759 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:17 compute-0 ovn_controller[95452]: 2025-12-10T10:23:17Z|00074|binding|INFO|Releasing lport e91b42bc-2e07-4f57-b591-e1950e4a7c0d from this chassis (sb_readonly=0)
Dec 10 10:23:17 compute-0 nova_compute[186989]: 2025-12-10 10:23:17.762 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:18 compute-0 podman[215046]: 2025-12-10 10:23:18.018851323 +0000 UTC m=+0.057031431 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:23:18 compute-0 nova_compute[186989]: 2025-12-10 10:23:18.026 186993 DEBUG nova.compute.manager [req-d66ca6ff-cad3-4b4f-9cbd-04a99c4396d9 req-7ddff72c-331a-4a34-949e-371e47a9b168 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Received event network-changed-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:23:18 compute-0 nova_compute[186989]: 2025-12-10 10:23:18.026 186993 DEBUG nova.compute.manager [req-d66ca6ff-cad3-4b4f-9cbd-04a99c4396d9 req-7ddff72c-331a-4a34-949e-371e47a9b168 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Refreshing instance network info cache due to event network-changed-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:23:18 compute-0 nova_compute[186989]: 2025-12-10 10:23:18.027 186993 DEBUG oslo_concurrency.lockutils [req-d66ca6ff-cad3-4b4f-9cbd-04a99c4396d9 req-7ddff72c-331a-4a34-949e-371e47a9b168 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:23:18 compute-0 nova_compute[186989]: 2025-12-10 10:23:18.027 186993 DEBUG oslo_concurrency.lockutils [req-d66ca6ff-cad3-4b4f-9cbd-04a99c4396d9 req-7ddff72c-331a-4a34-949e-371e47a9b168 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:23:18 compute-0 nova_compute[186989]: 2025-12-10 10:23:18.028 186993 DEBUG nova.network.neutron [req-d66ca6ff-cad3-4b4f-9cbd-04a99c4396d9 req-7ddff72c-331a-4a34-949e-371e47a9b168 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Refreshing network info cache for port 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:23:19 compute-0 nova_compute[186989]: 2025-12-10 10:23:19.624 186993 DEBUG nova.network.neutron [req-d66ca6ff-cad3-4b4f-9cbd-04a99c4396d9 req-7ddff72c-331a-4a34-949e-371e47a9b168 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Updated VIF entry in instance network info cache for port 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:23:19 compute-0 nova_compute[186989]: 2025-12-10 10:23:19.625 186993 DEBUG nova.network.neutron [req-d66ca6ff-cad3-4b4f-9cbd-04a99c4396d9 req-7ddff72c-331a-4a34-949e-371e47a9b168 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Updating instance_info_cache with network_info: [{"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:23:19 compute-0 nova_compute[186989]: 2025-12-10 10:23:19.646 186993 DEBUG oslo_concurrency.lockutils [req-d66ca6ff-cad3-4b4f-9cbd-04a99c4396d9 req-7ddff72c-331a-4a34-949e-371e47a9b168 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:23:21 compute-0 nova_compute[186989]: 2025-12-10 10:23:21.938 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:22 compute-0 nova_compute[186989]: 2025-12-10 10:23:22.284 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:24 compute-0 podman[215065]: 2025-12-10 10:23:24.047754544 +0000 UTC m=+0.074996833 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:23:24 compute-0 podman[215066]: 2025-12-10 10:23:24.058911999 +0000 UTC m=+0.083054794 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 10 10:23:24 compute-0 podman[215067]: 2025-12-10 10:23:24.09298653 +0000 UTC m=+0.108097777 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 10 10:23:26 compute-0 nova_compute[186989]: 2025-12-10 10:23:26.943 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:27 compute-0 ovn_controller[95452]: 2025-12-10T10:23:26Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c1:14:09 10.100.0.11
Dec 10 10:23:27 compute-0 ovn_controller[95452]: 2025-12-10T10:23:27Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:14:09 10.100.0.11
Dec 10 10:23:27 compute-0 nova_compute[186989]: 2025-12-10 10:23:27.286 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:31.464 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:31.465 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:31.466 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:31 compute-0 nova_compute[186989]: 2025-12-10 10:23:31.947 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:32 compute-0 nova_compute[186989]: 2025-12-10 10:23:32.287 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:32 compute-0 nova_compute[186989]: 2025-12-10 10:23:32.880 186993 INFO nova.compute.manager [None req-53a1e926-9f05-4a0f-beb6-0df676367b4d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Get console output
Dec 10 10:23:32 compute-0 nova_compute[186989]: 2025-12-10 10:23:32.888 213152 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 10 10:23:33 compute-0 podman[215140]: 2025-12-10 10:23:33.021477986 +0000 UTC m=+0.069897163 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41)
Dec 10 10:23:35 compute-0 nova_compute[186989]: 2025-12-10 10:23:35.586 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:35 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:35.588 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:d5:69', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:b1:dd:ed:fa:0b'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:23:35 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:35.589 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 10 10:23:35 compute-0 nova_compute[186989]: 2025-12-10 10:23:35.605 186993 DEBUG nova.compute.manager [req-eeb3300d-a45c-4b6f-8dc2-73eafe35aef6 req-946f579f-74ff-4118-9680-c183a4069cc9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Received event network-changed-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:23:35 compute-0 nova_compute[186989]: 2025-12-10 10:23:35.606 186993 DEBUG nova.compute.manager [req-eeb3300d-a45c-4b6f-8dc2-73eafe35aef6 req-946f579f-74ff-4118-9680-c183a4069cc9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Refreshing instance network info cache due to event network-changed-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:23:35 compute-0 nova_compute[186989]: 2025-12-10 10:23:35.606 186993 DEBUG oslo_concurrency.lockutils [req-eeb3300d-a45c-4b6f-8dc2-73eafe35aef6 req-946f579f-74ff-4118-9680-c183a4069cc9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:23:35 compute-0 nova_compute[186989]: 2025-12-10 10:23:35.607 186993 DEBUG oslo_concurrency.lockutils [req-eeb3300d-a45c-4b6f-8dc2-73eafe35aef6 req-946f579f-74ff-4118-9680-c183a4069cc9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:23:35 compute-0 nova_compute[186989]: 2025-12-10 10:23:35.607 186993 DEBUG nova.network.neutron [req-eeb3300d-a45c-4b6f-8dc2-73eafe35aef6 req-946f579f-74ff-4118-9680-c183a4069cc9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Refreshing network info cache for port 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:23:36 compute-0 podman[215161]: 2025-12-10 10:23:36.011209982 +0000 UTC m=+0.055811107 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:23:36 compute-0 nova_compute[186989]: 2025-12-10 10:23:36.868 186993 DEBUG nova.network.neutron [req-eeb3300d-a45c-4b6f-8dc2-73eafe35aef6 req-946f579f-74ff-4118-9680-c183a4069cc9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Updated VIF entry in instance network info cache for port 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:23:36 compute-0 nova_compute[186989]: 2025-12-10 10:23:36.869 186993 DEBUG nova.network.neutron [req-eeb3300d-a45c-4b6f-8dc2-73eafe35aef6 req-946f579f-74ff-4118-9680-c183a4069cc9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Updating instance_info_cache with network_info: [{"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:23:36 compute-0 nova_compute[186989]: 2025-12-10 10:23:36.897 186993 DEBUG oslo_concurrency.lockutils [req-eeb3300d-a45c-4b6f-8dc2-73eafe35aef6 req-946f579f-74ff-4118-9680-c183a4069cc9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:23:36 compute-0 nova_compute[186989]: 2025-12-10 10:23:36.950 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:37 compute-0 nova_compute[186989]: 2025-12-10 10:23:37.290 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:37.591 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:41 compute-0 nova_compute[186989]: 2025-12-10 10:23:41.953 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:42 compute-0 nova_compute[186989]: 2025-12-10 10:23:42.347 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:43 compute-0 nova_compute[186989]: 2025-12-10 10:23:43.966 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "2b32429c-b1df-4be5-abed-5c314400a14a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:43 compute-0 nova_compute[186989]: 2025-12-10 10:23:43.967 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:43 compute-0 nova_compute[186989]: 2025-12-10 10:23:43.984 186993 DEBUG nova.compute.manager [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.089 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.090 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.099 186993 DEBUG nova.virt.hardware [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.099 186993 INFO nova.compute.claims [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.307 186993 DEBUG nova.compute.provider_tree [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.323 186993 DEBUG nova.scheduler.client.report [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.357 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.358 186993 DEBUG nova.compute.manager [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.414 186993 DEBUG nova.compute.manager [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.415 186993 DEBUG nova.network.neutron [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.442 186993 INFO nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.472 186993 DEBUG nova.compute.manager [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.566 186993 DEBUG nova.compute.manager [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.568 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.569 186993 INFO nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Creating image(s)
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.570 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.570 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.571 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.589 186993 DEBUG oslo_concurrency.processutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.668 186993 DEBUG nova.policy [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.686 186993 DEBUG oslo_concurrency.processutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.687 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.688 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.710 186993 DEBUG oslo_concurrency.processutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.775 186993 DEBUG oslo_concurrency.processutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.777 186993 DEBUG oslo_concurrency.processutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.821 186993 DEBUG oslo_concurrency.processutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.823 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.824 186993 DEBUG oslo_concurrency.processutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.904 186993 DEBUG oslo_concurrency.processutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.906 186993 DEBUG nova.virt.disk.api [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.907 186993 DEBUG oslo_concurrency.processutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.985 186993 DEBUG oslo_concurrency.processutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.987 186993 DEBUG nova.virt.disk.api [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:23:44 compute-0 nova_compute[186989]: 2025-12-10 10:23:44.988 186993 DEBUG nova.objects.instance [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid 2b32429c-b1df-4be5-abed-5c314400a14a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:23:45 compute-0 nova_compute[186989]: 2025-12-10 10:23:45.009 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:23:45 compute-0 nova_compute[186989]: 2025-12-10 10:23:45.009 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Ensure instance console log exists: /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:23:45 compute-0 nova_compute[186989]: 2025-12-10 10:23:45.010 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:45 compute-0 nova_compute[186989]: 2025-12-10 10:23:45.010 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:45 compute-0 nova_compute[186989]: 2025-12-10 10:23:45.010 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:45 compute-0 nova_compute[186989]: 2025-12-10 10:23:45.214 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:23:45 compute-0 nova_compute[186989]: 2025-12-10 10:23:45.868 186993 DEBUG nova.network.neutron [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Successfully created port: f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:23:46 compute-0 nova_compute[186989]: 2025-12-10 10:23:46.877 186993 DEBUG nova.network.neutron [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Successfully updated port: f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:23:46 compute-0 nova_compute[186989]: 2025-12-10 10:23:46.904 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-2b32429c-b1df-4be5-abed-5c314400a14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:23:46 compute-0 nova_compute[186989]: 2025-12-10 10:23:46.905 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-2b32429c-b1df-4be5-abed-5c314400a14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:23:46 compute-0 nova_compute[186989]: 2025-12-10 10:23:46.905 186993 DEBUG nova.network.neutron [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:23:46 compute-0 nova_compute[186989]: 2025-12-10 10:23:46.957 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:47 compute-0 nova_compute[186989]: 2025-12-10 10:23:47.004 186993 DEBUG nova.compute.manager [req-dd5d4264-040d-4bd9-b5c5-f512302dddf9 req-4b4fc944-2b7a-4651-a4b3-cb552be8cb46 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Received event network-changed-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:23:47 compute-0 nova_compute[186989]: 2025-12-10 10:23:47.004 186993 DEBUG nova.compute.manager [req-dd5d4264-040d-4bd9-b5c5-f512302dddf9 req-4b4fc944-2b7a-4651-a4b3-cb552be8cb46 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Refreshing instance network info cache due to event network-changed-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:23:47 compute-0 nova_compute[186989]: 2025-12-10 10:23:47.005 186993 DEBUG oslo_concurrency.lockutils [req-dd5d4264-040d-4bd9-b5c5-f512302dddf9 req-4b4fc944-2b7a-4651-a4b3-cb552be8cb46 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-2b32429c-b1df-4be5-abed-5c314400a14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:23:47 compute-0 podman[215200]: 2025-12-10 10:23:47.030750953 +0000 UTC m=+0.068175121 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:23:47 compute-0 nova_compute[186989]: 2025-12-10 10:23:47.349 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:47 compute-0 nova_compute[186989]: 2025-12-10 10:23:47.390 186993 DEBUG nova.network.neutron [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:23:47 compute-0 nova_compute[186989]: 2025-12-10 10:23:47.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:23:47 compute-0 nova_compute[186989]: 2025-12-10 10:23:47.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:23:47 compute-0 nova_compute[186989]: 2025-12-10 10:23:47.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.417 186993 DEBUG nova.network.neutron [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Updating instance_info_cache with network_info: [{"id": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "address": "fa:16:3e:83:0a:19", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0ff5dc3-27", "ovs_interfaceid": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.766 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-2b32429c-b1df-4be5-abed-5c314400a14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.767 186993 DEBUG nova.compute.manager [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Instance network_info: |[{"id": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "address": "fa:16:3e:83:0a:19", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0ff5dc3-27", "ovs_interfaceid": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.768 186993 DEBUG oslo_concurrency.lockutils [req-dd5d4264-040d-4bd9-b5c5-f512302dddf9 req-4b4fc944-2b7a-4651-a4b3-cb552be8cb46 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-2b32429c-b1df-4be5-abed-5c314400a14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.769 186993 DEBUG nova.network.neutron [req-dd5d4264-040d-4bd9-b5c5-f512302dddf9 req-4b4fc944-2b7a-4651-a4b3-cb552be8cb46 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Refreshing network info cache for port f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.774 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Start _get_guest_xml network_info=[{"id": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "address": "fa:16:3e:83:0a:19", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0ff5dc3-27", "ovs_interfaceid": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.782 186993 WARNING nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.792 186993 DEBUG nova.virt.libvirt.host [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.793 186993 DEBUG nova.virt.libvirt.host [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.798 186993 DEBUG nova.virt.libvirt.host [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.799 186993 DEBUG nova.virt.libvirt.host [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.800 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.800 186993 DEBUG nova.virt.hardware [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.801 186993 DEBUG nova.virt.hardware [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.801 186993 DEBUG nova.virt.hardware [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.801 186993 DEBUG nova.virt.hardware [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.802 186993 DEBUG nova.virt.hardware [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.802 186993 DEBUG nova.virt.hardware [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.802 186993 DEBUG nova.virt.hardware [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.803 186993 DEBUG nova.virt.hardware [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.803 186993 DEBUG nova.virt.hardware [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.803 186993 DEBUG nova.virt.hardware [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.803 186993 DEBUG nova.virt.hardware [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.808 186993 DEBUG nova.virt.libvirt.vif [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:23:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1587201938',display_name='tempest-TestNetworkBasicOps-server-1587201938',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1587201938',id=5,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKnujjhvzFl0HNmkV8oR7NwUX9Lp+SscG0gnl0HEFPizw9ZzKBcwRuGLkUn+9Iw41otvi+zficm+MbR8+QqjdJSrw/vsTnCkyjso2wJXc4wt4lJcQJXum9lRd8bGN5JKeA==',key_name='tempest-TestNetworkBasicOps-1001037321',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-egc6bbcb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:23:44Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=2b32429c-b1df-4be5-abed-5c314400a14a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "address": "fa:16:3e:83:0a:19", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0ff5dc3-27", "ovs_interfaceid": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.809 186993 DEBUG nova.network.os_vif_util [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "address": "fa:16:3e:83:0a:19", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0ff5dc3-27", "ovs_interfaceid": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.809 186993 DEBUG nova.network.os_vif_util [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:0a:19,bridge_name='br-int',has_traffic_filtering=True,id=f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0ff5dc3-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.810 186993 DEBUG nova.objects.instance [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2b32429c-b1df-4be5-abed-5c314400a14a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.824 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:23:48 compute-0 nova_compute[186989]:   <uuid>2b32429c-b1df-4be5-abed-5c314400a14a</uuid>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   <name>instance-00000005</name>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-1587201938</nova:name>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:23:48</nova:creationTime>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:23:48 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:23:48 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:23:48 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:23:48 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:23:48 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:23:48 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:23:48 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:23:48 compute-0 nova_compute[186989]:         <nova:port uuid="f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae">
Dec 10 10:23:48 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <system>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <entry name="serial">2b32429c-b1df-4be5-abed-5c314400a14a</entry>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <entry name="uuid">2b32429c-b1df-4be5-abed-5c314400a14a</entry>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     </system>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   <os>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   </os>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   <features>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   </features>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk.config"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:83:0a:19"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <target dev="tapf0ff5dc3-27"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/console.log" append="off"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <video>
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     </video>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:23:48 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:23:48 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:23:48 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:23:48 compute-0 nova_compute[186989]: </domain>
Dec 10 10:23:48 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.826 186993 DEBUG nova.compute.manager [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Preparing to wait for external event network-vif-plugged-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.827 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.828 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.828 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.829 186993 DEBUG nova.virt.libvirt.vif [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:23:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1587201938',display_name='tempest-TestNetworkBasicOps-server-1587201938',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1587201938',id=5,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKnujjhvzFl0HNmkV8oR7NwUX9Lp+SscG0gnl0HEFPizw9ZzKBcwRuGLkUn+9Iw41otvi+zficm+MbR8+QqjdJSrw/vsTnCkyjso2wJXc4wt4lJcQJXum9lRd8bGN5JKeA==',key_name='tempest-TestNetworkBasicOps-1001037321',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-egc6bbcb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:23:44Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=2b32429c-b1df-4be5-abed-5c314400a14a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "address": "fa:16:3e:83:0a:19", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0ff5dc3-27", "ovs_interfaceid": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.830 186993 DEBUG nova.network.os_vif_util [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "address": "fa:16:3e:83:0a:19", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0ff5dc3-27", "ovs_interfaceid": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.831 186993 DEBUG nova.network.os_vif_util [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:0a:19,bridge_name='br-int',has_traffic_filtering=True,id=f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0ff5dc3-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.832 186993 DEBUG os_vif [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:0a:19,bridge_name='br-int',has_traffic_filtering=True,id=f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0ff5dc3-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.833 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.834 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.835 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.840 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.840 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0ff5dc3-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.841 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf0ff5dc3-27, col_values=(('external_ids', {'iface-id': 'f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:0a:19', 'vm-uuid': '2b32429c-b1df-4be5-abed-5c314400a14a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:48 compute-0 NetworkManager[55541]: <info>  [1765362228.8452] manager: (tapf0ff5dc3-27): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.847 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.853 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.854 186993 INFO os_vif [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:0a:19,bridge_name='br-int',has_traffic_filtering=True,id=f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0ff5dc3-27')
Dec 10 10:23:48 compute-0 nova_compute[186989]: 2025-12-10 10:23:48.916 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:23:48 compute-0 podman[215228]: 2025-12-10 10:23:48.98091214 +0000 UTC m=+0.077939713 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.350 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.350 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.350 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.362 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.362 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.362 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:83:0a:19, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.363 186993 INFO nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Using config drive
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.369 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.514 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.515 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquired lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.515 186993 DEBUG nova.network.neutron [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.515 186993 DEBUG nova.objects.instance [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 247bc6b9-be00-4776-a195-f7a413953734 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.784 186993 INFO nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Creating config drive at /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk.config
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.789 186993 DEBUG oslo_concurrency.processutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvbparqu7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.915 186993 DEBUG oslo_concurrency.processutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvbparqu7" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.972 186993 DEBUG nova.network.neutron [req-dd5d4264-040d-4bd9-b5c5-f512302dddf9 req-4b4fc944-2b7a-4651-a4b3-cb552be8cb46 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Updated VIF entry in instance network info cache for port f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.974 186993 DEBUG nova.network.neutron [req-dd5d4264-040d-4bd9-b5c5-f512302dddf9 req-4b4fc944-2b7a-4651-a4b3-cb552be8cb46 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Updating instance_info_cache with network_info: [{"id": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "address": "fa:16:3e:83:0a:19", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0ff5dc3-27", "ovs_interfaceid": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:23:49 compute-0 kernel: tapf0ff5dc3-27: entered promiscuous mode
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.994 186993 DEBUG oslo_concurrency.lockutils [req-dd5d4264-040d-4bd9-b5c5-f512302dddf9 req-4b4fc944-2b7a-4651-a4b3-cb552be8cb46 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-2b32429c-b1df-4be5-abed-5c314400a14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:23:49 compute-0 NetworkManager[55541]: <info>  [1765362229.9968] manager: (tapf0ff5dc3-27): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Dec 10 10:23:49 compute-0 nova_compute[186989]: 2025-12-10 10:23:49.996 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:49 compute-0 ovn_controller[95452]: 2025-12-10T10:23:49Z|00075|binding|INFO|Claiming lport f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae for this chassis.
Dec 10 10:23:49 compute-0 ovn_controller[95452]: 2025-12-10T10:23:49Z|00076|binding|INFO|f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae: Claiming fa:16:3e:83:0a:19 10.100.0.3
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.003 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:0a:19 10.100.0.3'], port_security=['fa:16:3e:83:0a:19 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '2b32429c-b1df-4be5-abed-5c314400a14a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': '609e2476-7104-4149-9552-6eb5806caf35', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f15ac783-5514-42e3-9054-2f4e6c3b1dab, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.004 104302 INFO neutron.agent.ovn.metadata.agent [-] Port f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae in datapath bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1 bound to our chassis
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.005 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1
Dec 10 10:23:50 compute-0 ovn_controller[95452]: 2025-12-10T10:23:50Z|00077|binding|INFO|Setting lport f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae ovn-installed in OVS
Dec 10 10:23:50 compute-0 ovn_controller[95452]: 2025-12-10T10:23:50Z|00078|binding|INFO|Setting lport f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae up in Southbound
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.009 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.016 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.021 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[38cb3ede-3fee-46d6-a66c-1269e454b263]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:50 compute-0 systemd-udevd[215263]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:23:50 compute-0 systemd-machined[153379]: New machine qemu-5-instance-00000005.
Dec 10 10:23:50 compute-0 NetworkManager[55541]: <info>  [1765362230.0429] device (tapf0ff5dc3-27): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:23:50 compute-0 NetworkManager[55541]: <info>  [1765362230.0435] device (tapf0ff5dc3-27): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:23:50 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.056 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[b1ecc212-81a5-40e5-af8a-b3e9e79d4daa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.062 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[7a8dabf9-ff17-472d-abd9-64ef261d25eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.094 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[dacb1ee2-e3e9-44c0-ba46-2a7a12724008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.113 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5921b7e4-16c4-412c-b4a8-dd043f22d292]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbbc6d33e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:db:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 320812, 'reachable_time': 40487, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215276, 'error': None, 'target': 'ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.134 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[d4df1ca6-483f-4d58-877a-fe26c08109e4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbbc6d33e-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 320822, 'tstamp': 320822}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215278, 'error': None, 'target': 'ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapbbc6d33e-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 320824, 'tstamp': 320824}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215278, 'error': None, 'target': 'ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.136 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbbc6d33e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.138 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.139 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.140 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbbc6d33e-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.140 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.140 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbbc6d33e-c0, col_values=(('external_ids', {'iface-id': 'e91b42bc-2e07-4f57-b591-e1950e4a7c0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:23:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:23:50.140 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.250 186993 DEBUG nova.compute.manager [req-9cc04415-3da9-41c1-b8b5-e6761e9d7f49 req-91256593-f742-4a5d-8698-23f16f0c54ed 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Received event network-vif-plugged-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.250 186993 DEBUG oslo_concurrency.lockutils [req-9cc04415-3da9-41c1-b8b5-e6761e9d7f49 req-91256593-f742-4a5d-8698-23f16f0c54ed 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.251 186993 DEBUG oslo_concurrency.lockutils [req-9cc04415-3da9-41c1-b8b5-e6761e9d7f49 req-91256593-f742-4a5d-8698-23f16f0c54ed 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.251 186993 DEBUG oslo_concurrency.lockutils [req-9cc04415-3da9-41c1-b8b5-e6761e9d7f49 req-91256593-f742-4a5d-8698-23f16f0c54ed 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.251 186993 DEBUG nova.compute.manager [req-9cc04415-3da9-41c1-b8b5-e6761e9d7f49 req-91256593-f742-4a5d-8698-23f16f0c54ed 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Processing event network-vif-plugged-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.377 186993 DEBUG nova.compute.manager [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.378 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362230.3770695, 2b32429c-b1df-4be5-abed-5c314400a14a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.378 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] VM Started (Lifecycle Event)
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.382 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.387 186993 INFO nova.virt.libvirt.driver [-] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Instance spawned successfully.
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.387 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.401 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.410 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.413 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.414 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.414 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.415 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.415 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.416 186993 DEBUG nova.virt.libvirt.driver [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.446 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.446 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362230.3782334, 2b32429c-b1df-4be5-abed-5c314400a14a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.447 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] VM Paused (Lifecycle Event)
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.482 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.486 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362230.3810315, 2b32429c-b1df-4be5-abed-5c314400a14a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.486 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] VM Resumed (Lifecycle Event)
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.493 186993 INFO nova.compute.manager [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Took 5.93 seconds to spawn the instance on the hypervisor.
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.494 186993 DEBUG nova.compute.manager [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.518 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.521 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.544 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.562 186993 INFO nova.compute.manager [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Took 6.50 seconds to build instance.
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.582 186993 DEBUG oslo_concurrency.lockutils [None req-9cf7f739-68f2-47f1-b5eb-4c0180f7350a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.776 186993 DEBUG nova.network.neutron [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Updating instance_info_cache with network_info: [{"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.790 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Releasing lock "refresh_cache-247bc6b9-be00-4776-a195-f7a413953734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.791 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.792 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:23:50 compute-0 nova_compute[186989]: 2025-12-10 10:23:50.792 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:23:51 compute-0 nova_compute[186989]: 2025-12-10 10:23:51.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:23:51 compute-0 nova_compute[186989]: 2025-12-10 10:23:51.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:23:51 compute-0 nova_compute[186989]: 2025-12-10 10:23:51.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:23:51 compute-0 nova_compute[186989]: 2025-12-10 10:23:51.957 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:51 compute-0 nova_compute[186989]: 2025-12-10 10:23:51.957 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:51 compute-0 nova_compute[186989]: 2025-12-10 10:23:51.958 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:51 compute-0 nova_compute[186989]: 2025-12-10 10:23:51.958 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.044 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.118 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.120 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.184 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.191 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.251 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.252 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.320 186993 DEBUG nova.compute.manager [req-98c8997b-cc4d-4f1d-a941-94179a8cec77 req-52fa4fbd-0a1d-40f3-9383-159b2eefd69d 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Received event network-vif-plugged-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.321 186993 DEBUG oslo_concurrency.lockutils [req-98c8997b-cc4d-4f1d-a941-94179a8cec77 req-52fa4fbd-0a1d-40f3-9383-159b2eefd69d 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.322 186993 DEBUG oslo_concurrency.lockutils [req-98c8997b-cc4d-4f1d-a941-94179a8cec77 req-52fa4fbd-0a1d-40f3-9383-159b2eefd69d 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.323 186993 DEBUG oslo_concurrency.lockutils [req-98c8997b-cc4d-4f1d-a941-94179a8cec77 req-52fa4fbd-0a1d-40f3-9383-159b2eefd69d 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.323 186993 DEBUG nova.compute.manager [req-98c8997b-cc4d-4f1d-a941-94179a8cec77 req-52fa4fbd-0a1d-40f3-9383-159b2eefd69d 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] No waiting events found dispatching network-vif-plugged-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.324 186993 WARNING nova.compute.manager [req-98c8997b-cc4d-4f1d-a941-94179a8cec77 req-52fa4fbd-0a1d-40f3-9383-159b2eefd69d 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Received unexpected event network-vif-plugged-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae for instance with vm_state active and task_state None.
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.325 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.352 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.502 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.504 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5456MB free_disk=73.30051040649414GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.505 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.505 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.584 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Instance 247bc6b9-be00-4776-a195-f7a413953734 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.585 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Instance 2b32429c-b1df-4be5-abed-5c314400a14a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.585 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.585 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.661 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.683 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.704 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:23:52 compute-0 nova_compute[186989]: 2025-12-10 10:23:52.705 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:23:53 compute-0 nova_compute[186989]: 2025-12-10 10:23:53.845 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:54 compute-0 nova_compute[186989]: 2025-12-10 10:23:54.401 186993 DEBUG nova.compute.manager [req-1b0a91ab-8401-46f7-9d80-c608a2fe4d1e req-6762b800-640b-41a7-b2e8-0009bd66f1cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Received event network-changed-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:23:54 compute-0 nova_compute[186989]: 2025-12-10 10:23:54.401 186993 DEBUG nova.compute.manager [req-1b0a91ab-8401-46f7-9d80-c608a2fe4d1e req-6762b800-640b-41a7-b2e8-0009bd66f1cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Refreshing instance network info cache due to event network-changed-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:23:54 compute-0 nova_compute[186989]: 2025-12-10 10:23:54.402 186993 DEBUG oslo_concurrency.lockutils [req-1b0a91ab-8401-46f7-9d80-c608a2fe4d1e req-6762b800-640b-41a7-b2e8-0009bd66f1cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-2b32429c-b1df-4be5-abed-5c314400a14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:23:54 compute-0 nova_compute[186989]: 2025-12-10 10:23:54.402 186993 DEBUG oslo_concurrency.lockutils [req-1b0a91ab-8401-46f7-9d80-c608a2fe4d1e req-6762b800-640b-41a7-b2e8-0009bd66f1cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-2b32429c-b1df-4be5-abed-5c314400a14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:23:54 compute-0 nova_compute[186989]: 2025-12-10 10:23:54.402 186993 DEBUG nova.network.neutron [req-1b0a91ab-8401-46f7-9d80-c608a2fe4d1e req-6762b800-640b-41a7-b2e8-0009bd66f1cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Refreshing network info cache for port f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:23:55 compute-0 podman[215301]: 2025-12-10 10:23:55.047548881 +0000 UTC m=+0.072952050 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 10 10:23:55 compute-0 podman[215300]: 2025-12-10 10:23:55.050944742 +0000 UTC m=+0.076473844 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Dec 10 10:23:55 compute-0 podman[215302]: 2025-12-10 10:23:55.105096107 +0000 UTC m=+0.130519306 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 10 10:23:57 compute-0 nova_compute[186989]: 2025-12-10 10:23:57.399 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:23:57 compute-0 nova_compute[186989]: 2025-12-10 10:23:57.635 186993 DEBUG nova.network.neutron [req-1b0a91ab-8401-46f7-9d80-c608a2fe4d1e req-6762b800-640b-41a7-b2e8-0009bd66f1cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Updated VIF entry in instance network info cache for port f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:23:57 compute-0 nova_compute[186989]: 2025-12-10 10:23:57.636 186993 DEBUG nova.network.neutron [req-1b0a91ab-8401-46f7-9d80-c608a2fe4d1e req-6762b800-640b-41a7-b2e8-0009bd66f1cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Updating instance_info_cache with network_info: [{"id": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "address": "fa:16:3e:83:0a:19", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0ff5dc3-27", "ovs_interfaceid": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:23:57 compute-0 nova_compute[186989]: 2025-12-10 10:23:57.656 186993 DEBUG oslo_concurrency.lockutils [req-1b0a91ab-8401-46f7-9d80-c608a2fe4d1e req-6762b800-640b-41a7-b2e8-0009bd66f1cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-2b32429c-b1df-4be5-abed-5c314400a14a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:23:58 compute-0 nova_compute[186989]: 2025-12-10 10:23:58.849 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:02 compute-0 nova_compute[186989]: 2025-12-10 10:24:02.401 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:03 compute-0 ovn_controller[95452]: 2025-12-10T10:24:03Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:0a:19 10.100.0.3
Dec 10 10:24:03 compute-0 ovn_controller[95452]: 2025-12-10T10:24:03Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:0a:19 10.100.0.3
Dec 10 10:24:03 compute-0 nova_compute[186989]: 2025-12-10 10:24:03.853 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:04 compute-0 podman[215384]: 2025-12-10 10:24:04.06403765 +0000 UTC m=+0.090440990 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, config_id=edpm, io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 10 10:24:07 compute-0 podman[215404]: 2025-12-10 10:24:07.042400126 +0000 UTC m=+0.071925233 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:24:07 compute-0 nova_compute[186989]: 2025-12-10 10:24:07.404 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:08 compute-0 nova_compute[186989]: 2025-12-10 10:24:08.857 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:09 compute-0 nova_compute[186989]: 2025-12-10 10:24:09.600 186993 INFO nova.compute.manager [None req-22ecc566-cbe7-4d58-babf-c7529c8c477e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Get console output
Dec 10 10:24:09 compute-0 nova_compute[186989]: 2025-12-10 10:24:09.605 213152 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 10 10:24:09 compute-0 nova_compute[186989]: 2025-12-10 10:24:09.943 186993 DEBUG oslo_concurrency.lockutils [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "2b32429c-b1df-4be5-abed-5c314400a14a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:09 compute-0 nova_compute[186989]: 2025-12-10 10:24:09.944 186993 DEBUG oslo_concurrency.lockutils [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:09 compute-0 nova_compute[186989]: 2025-12-10 10:24:09.944 186993 DEBUG oslo_concurrency.lockutils [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:09 compute-0 nova_compute[186989]: 2025-12-10 10:24:09.944 186993 DEBUG oslo_concurrency.lockutils [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:09 compute-0 nova_compute[186989]: 2025-12-10 10:24:09.945 186993 DEBUG oslo_concurrency.lockutils [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:09 compute-0 nova_compute[186989]: 2025-12-10 10:24:09.946 186993 INFO nova.compute.manager [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Terminating instance
Dec 10 10:24:09 compute-0 nova_compute[186989]: 2025-12-10 10:24:09.948 186993 DEBUG nova.compute.manager [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:24:09 compute-0 kernel: tapf0ff5dc3-27 (unregistering): left promiscuous mode
Dec 10 10:24:09 compute-0 NetworkManager[55541]: <info>  [1765362249.9821] device (tapf0ff5dc3-27): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:24:09 compute-0 ovn_controller[95452]: 2025-12-10T10:24:09Z|00079|binding|INFO|Releasing lport f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae from this chassis (sb_readonly=0)
Dec 10 10:24:09 compute-0 nova_compute[186989]: 2025-12-10 10:24:09.986 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:09 compute-0 ovn_controller[95452]: 2025-12-10T10:24:09Z|00080|binding|INFO|Setting lport f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae down in Southbound
Dec 10 10:24:09 compute-0 ovn_controller[95452]: 2025-12-10T10:24:09Z|00081|binding|INFO|Removing iface tapf0ff5dc3-27 ovn-installed in OVS
Dec 10 10:24:09 compute-0 nova_compute[186989]: 2025-12-10 10:24:09.989 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:09.995 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:0a:19 10.100.0.3'], port_security=['fa:16:3e:83:0a:19 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '2b32429c-b1df-4be5-abed-5c314400a14a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '4', 'neutron:security_group_ids': '609e2476-7104-4149-9552-6eb5806caf35', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f15ac783-5514-42e3-9054-2f4e6c3b1dab, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:24:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:09.996 104302 INFO neutron.agent.ovn.metadata.agent [-] Port f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae in datapath bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1 unbound from our chassis
Dec 10 10:24:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:09.997 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.006 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:10 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:10.015 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[8608c49d-59b7-4d4b-bfef-eb6c45f060b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:10 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Dec 10 10:24:10 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 13.151s CPU time.
Dec 10 10:24:10 compute-0 systemd-machined[153379]: Machine qemu-5-instance-00000005 terminated.
Dec 10 10:24:10 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:10.053 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb2a27f-de40-48a2-b6f0-da35a3d93df0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:10 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:10.057 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[39e75e57-69cd-42ae-8c2b-9e427a8655ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:10 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:10.095 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bff514-2355-453b-b580-5e2a2b14b4ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:10 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:10.122 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[ebefe14f-8320-4cf1-b327-5f85446d282a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbbc6d33e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:db:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 320812, 'reachable_time': 40487, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215441, 'error': None, 'target': 'ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:10 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:10.137 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0bba99-9176-45bb-b8f7-309c5dd1cc12]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbbc6d33e-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 320822, 'tstamp': 320822}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215442, 'error': None, 'target': 'ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapbbc6d33e-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 320824, 'tstamp': 320824}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215442, 'error': None, 'target': 'ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:10 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:10.139 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbbc6d33e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.142 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.147 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:10 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:10.148 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbbc6d33e-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:10 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:10.149 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:24:10 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:10.150 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbbc6d33e-c0, col_values=(('external_ids', {'iface-id': 'e91b42bc-2e07-4f57-b591-e1950e4a7c0d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:10 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:10.151 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.224 186993 INFO nova.virt.libvirt.driver [-] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Instance destroyed successfully.
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.226 186993 DEBUG nova.objects.instance [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid 2b32429c-b1df-4be5-abed-5c314400a14a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.244 186993 DEBUG nova.virt.libvirt.vif [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:23:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1587201938',display_name='tempest-TestNetworkBasicOps-server-1587201938',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1587201938',id=5,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKnujjhvzFl0HNmkV8oR7NwUX9Lp+SscG0gnl0HEFPizw9ZzKBcwRuGLkUn+9Iw41otvi+zficm+MbR8+QqjdJSrw/vsTnCkyjso2wJXc4wt4lJcQJXum9lRd8bGN5JKeA==',key_name='tempest-TestNetworkBasicOps-1001037321',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:23:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-egc6bbcb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:23:50Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=2b32429c-b1df-4be5-abed-5c314400a14a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "address": "fa:16:3e:83:0a:19", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0ff5dc3-27", "ovs_interfaceid": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.245 186993 DEBUG nova.network.os_vif_util [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "address": "fa:16:3e:83:0a:19", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf0ff5dc3-27", "ovs_interfaceid": "f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.245 186993 DEBUG nova.network.os_vif_util [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:0a:19,bridge_name='br-int',has_traffic_filtering=True,id=f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0ff5dc3-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.246 186993 DEBUG os_vif [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:0a:19,bridge_name='br-int',has_traffic_filtering=True,id=f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0ff5dc3-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.248 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.248 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0ff5dc3-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.294 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.296 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.300 186993 INFO os_vif [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:0a:19,bridge_name='br-int',has_traffic_filtering=True,id=f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf0ff5dc3-27')
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.301 186993 INFO nova.virt.libvirt.driver [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Deleting instance files /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a_del
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.302 186993 INFO nova.virt.libvirt.driver [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Deletion of /var/lib/nova/instances/2b32429c-b1df-4be5-abed-5c314400a14a_del complete
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.350 186993 INFO nova.compute.manager [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.351 186993 DEBUG oslo.service.loopingcall [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.351 186993 DEBUG nova.compute.manager [-] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.352 186993 DEBUG nova.network.neutron [-] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.702 186993 DEBUG nova.compute.manager [req-b5d4fef3-a451-4199-80f4-699f6ab03549 req-6b310ffc-77ed-4bd1-bcfd-15c26442f886 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Received event network-vif-unplugged-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.703 186993 DEBUG oslo_concurrency.lockutils [req-b5d4fef3-a451-4199-80f4-699f6ab03549 req-6b310ffc-77ed-4bd1-bcfd-15c26442f886 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.703 186993 DEBUG oslo_concurrency.lockutils [req-b5d4fef3-a451-4199-80f4-699f6ab03549 req-6b310ffc-77ed-4bd1-bcfd-15c26442f886 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.703 186993 DEBUG oslo_concurrency.lockutils [req-b5d4fef3-a451-4199-80f4-699f6ab03549 req-6b310ffc-77ed-4bd1-bcfd-15c26442f886 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.704 186993 DEBUG nova.compute.manager [req-b5d4fef3-a451-4199-80f4-699f6ab03549 req-6b310ffc-77ed-4bd1-bcfd-15c26442f886 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] No waiting events found dispatching network-vif-unplugged-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:24:10 compute-0 nova_compute[186989]: 2025-12-10 10:24:10.704 186993 DEBUG nova.compute.manager [req-b5d4fef3-a451-4199-80f4-699f6ab03549 req-6b310ffc-77ed-4bd1-bcfd-15c26442f886 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Received event network-vif-unplugged-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 10 10:24:11 compute-0 nova_compute[186989]: 2025-12-10 10:24:11.303 186993 DEBUG nova.network.neutron [-] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:24:11 compute-0 nova_compute[186989]: 2025-12-10 10:24:11.322 186993 INFO nova.compute.manager [-] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Took 0.97 seconds to deallocate network for instance.
Dec 10 10:24:11 compute-0 nova_compute[186989]: 2025-12-10 10:24:11.373 186993 DEBUG oslo_concurrency.lockutils [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:11 compute-0 nova_compute[186989]: 2025-12-10 10:24:11.374 186993 DEBUG oslo_concurrency.lockutils [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:11 compute-0 nova_compute[186989]: 2025-12-10 10:24:11.451 186993 DEBUG nova.compute.provider_tree [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:24:11 compute-0 nova_compute[186989]: 2025-12-10 10:24:11.466 186993 DEBUG nova.scheduler.client.report [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:24:11 compute-0 nova_compute[186989]: 2025-12-10 10:24:11.489 186993 DEBUG oslo_concurrency.lockutils [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:11 compute-0 nova_compute[186989]: 2025-12-10 10:24:11.520 186993 INFO nova.scheduler.client.report [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance 2b32429c-b1df-4be5-abed-5c314400a14a
Dec 10 10:24:11 compute-0 nova_compute[186989]: 2025-12-10 10:24:11.590 186993 DEBUG oslo_concurrency.lockutils [None req-49399bd7-3a4c-4853-b8b7-0dd2a2b971fa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.407 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.790 186993 DEBUG nova.compute.manager [req-24c8b676-0508-4149-be0b-de4c0cafa0ed req-f426fb7a-9f20-4742-b564-761716ad64be 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Received event network-vif-plugged-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.790 186993 DEBUG oslo_concurrency.lockutils [req-24c8b676-0508-4149-be0b-de4c0cafa0ed req-f426fb7a-9f20-4742-b564-761716ad64be 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.791 186993 DEBUG oslo_concurrency.lockutils [req-24c8b676-0508-4149-be0b-de4c0cafa0ed req-f426fb7a-9f20-4742-b564-761716ad64be 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.791 186993 DEBUG oslo_concurrency.lockutils [req-24c8b676-0508-4149-be0b-de4c0cafa0ed req-f426fb7a-9f20-4742-b564-761716ad64be 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "2b32429c-b1df-4be5-abed-5c314400a14a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.791 186993 DEBUG nova.compute.manager [req-24c8b676-0508-4149-be0b-de4c0cafa0ed req-f426fb7a-9f20-4742-b564-761716ad64be 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] No waiting events found dispatching network-vif-plugged-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.791 186993 WARNING nova.compute.manager [req-24c8b676-0508-4149-be0b-de4c0cafa0ed req-f426fb7a-9f20-4742-b564-761716ad64be 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Received unexpected event network-vif-plugged-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae for instance with vm_state deleted and task_state None.
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.791 186993 DEBUG nova.compute.manager [req-24c8b676-0508-4149-be0b-de4c0cafa0ed req-f426fb7a-9f20-4742-b564-761716ad64be 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Received event network-vif-deleted-f0ff5dc3-27a2-4b38-abe0-3ec8ea2910ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.847 186993 DEBUG oslo_concurrency.lockutils [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "247bc6b9-be00-4776-a195-f7a413953734" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.848 186993 DEBUG oslo_concurrency.lockutils [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.849 186993 DEBUG oslo_concurrency.lockutils [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "247bc6b9-be00-4776-a195-f7a413953734-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.849 186993 DEBUG oslo_concurrency.lockutils [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.849 186993 DEBUG oslo_concurrency.lockutils [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.851 186993 INFO nova.compute.manager [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Terminating instance
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.852 186993 DEBUG nova.compute.manager [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:24:12 compute-0 kernel: tap43a5ac9c-0f (unregistering): left promiscuous mode
Dec 10 10:24:12 compute-0 NetworkManager[55541]: <info>  [1765362252.8759] device (tap43a5ac9c-0f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.878 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:12 compute-0 ovn_controller[95452]: 2025-12-10T10:24:12Z|00082|binding|INFO|Releasing lport 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 from this chassis (sb_readonly=0)
Dec 10 10:24:12 compute-0 ovn_controller[95452]: 2025-12-10T10:24:12Z|00083|binding|INFO|Setting lport 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 down in Southbound
Dec 10 10:24:12 compute-0 ovn_controller[95452]: 2025-12-10T10:24:12Z|00084|binding|INFO|Removing iface tap43a5ac9c-0f ovn-installed in OVS
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.881 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:12.893 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:14:09 10.100.0.11'], port_security=['fa:16:3e:c1:14:09 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '247bc6b9-be00-4776-a195-f7a413953734', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3b98a055-b2ca-444f-b7fe-e5e3733253cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f15ac783-5514-42e3-9054-2f4e6c3b1dab, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:24:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:12.894 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 in datapath bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1 unbound from our chassis
Dec 10 10:24:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:12.895 104302 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 10 10:24:12 compute-0 nova_compute[186989]: 2025-12-10 10:24:12.897 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:12.897 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[c72e8ea4-72ac-443d-8aec-e03e83e2f25b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:12.897 104302 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1 namespace which is not needed anymore
Dec 10 10:24:12 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Dec 10 10:24:12 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 15.319s CPU time.
Dec 10 10:24:12 compute-0 systemd-machined[153379]: Machine qemu-4-instance-00000004 terminated.
Dec 10 10:24:13 compute-0 neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1[215005]: [NOTICE]   (215009) : haproxy version is 2.8.14-c23fe91
Dec 10 10:24:13 compute-0 neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1[215005]: [NOTICE]   (215009) : path to executable is /usr/sbin/haproxy
Dec 10 10:24:13 compute-0 neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1[215005]: [WARNING]  (215009) : Exiting Master process...
Dec 10 10:24:13 compute-0 neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1[215005]: [ALERT]    (215009) : Current worker (215011) exited with code 143 (Terminated)
Dec 10 10:24:13 compute-0 neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1[215005]: [WARNING]  (215009) : All workers exited. Exiting... (0)
Dec 10 10:24:13 compute-0 systemd[1]: libpod-6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e.scope: Deactivated successfully.
Dec 10 10:24:13 compute-0 podman[215483]: 2025-12-10 10:24:13.069452845 +0000 UTC m=+0.055662706 container died 6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 10 10:24:13 compute-0 NetworkManager[55541]: <info>  [1765362253.0731] manager: (tap43a5ac9c-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Dec 10 10:24:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e-userdata-shm.mount: Deactivated successfully.
Dec 10 10:24:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-47a785f88109bd706b3c3adc2f68ca527aa683bd8a9e25d2f209b7cc3d160b40-merged.mount: Deactivated successfully.
Dec 10 10:24:13 compute-0 podman[215483]: 2025-12-10 10:24:13.109404148 +0000 UTC m=+0.095614019 container cleanup 6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.130 186993 INFO nova.virt.libvirt.driver [-] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Instance destroyed successfully.
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.130 186993 DEBUG nova.objects.instance [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid 247bc6b9-be00-4776-a195-f7a413953734 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:24:13 compute-0 systemd[1]: libpod-conmon-6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e.scope: Deactivated successfully.
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.150 186993 DEBUG nova.virt.libvirt.vif [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:23:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1661384698',display_name='tempest-TestNetworkBasicOps-server-1661384698',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1661384698',id=4,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOoM2wxE/oQobBkwJW25+OIMBISiWlzYSSD+B1Ou1SGSJf8q1c5yTWdQRHu9jb1S94waK2J9TkgkUN7gAAqPQ0ctAKoNCthN3t+SBmA2139FZclCZYcu0m9TIbhiQgf1EA==',key_name='tempest-TestNetworkBasicOps-478102849',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:23:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-wy1scrod',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:23:13Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=247bc6b9-be00-4776-a195-f7a413953734,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.151 186993 DEBUG nova.network.os_vif_util [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "address": "fa:16:3e:c1:14:09", "network": {"id": "bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1", "bridge": "br-int", "label": "tempest-network-smoke--23874566", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43a5ac9c-0f", "ovs_interfaceid": "43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.152 186993 DEBUG nova.network.os_vif_util [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:14:09,bridge_name='br-int',has_traffic_filtering=True,id=43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a5ac9c-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.152 186993 DEBUG os_vif [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:14:09,bridge_name='br-int',has_traffic_filtering=True,id=43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a5ac9c-0f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.154 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.154 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43a5ac9c-0f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.157 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.158 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.160 186993 INFO os_vif [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:14:09,bridge_name='br-int',has_traffic_filtering=True,id=43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5,network=Network(bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43a5ac9c-0f')
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.161 186993 INFO nova.virt.libvirt.driver [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Deleting instance files /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734_del
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.164 186993 INFO nova.virt.libvirt.driver [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Deletion of /var/lib/nova/instances/247bc6b9-be00-4776-a195-f7a413953734_del complete
Dec 10 10:24:13 compute-0 podman[215524]: 2025-12-10 10:24:13.188448621 +0000 UTC m=+0.048220286 container remove 6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 10 10:24:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:13.194 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[02b76280-2cf0-4d60-9360-e8097780e129]: (4, ('Wed Dec 10 10:24:12 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1 (6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e)\n6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e\nWed Dec 10 10:24:13 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1 (6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e)\n6df8e346801206414898c45bbe4444c04c11bce036a905824dc164779b24332e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:13.197 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[9d2d0886-f46b-4391-9797-d73c578198bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:13.197 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbbc6d33e-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.199 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:13 compute-0 kernel: tapbbc6d33e-c0: left promiscuous mode
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.210 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:13.214 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[946414cb-73e0-4245-a022-40ebd4f28316]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:13.227 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[a54e8fb8-c103-4d2b-9ffc-0510e9d21e62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:13.228 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[ae562616-3004-47a3-ae85-353ed32e84a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:13.246 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5fadcba2-f23b-46e5-95a9-a62b1beb297c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 320805, 'reachable_time': 26103, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215545, 'error': None, 'target': 'ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:13.249 104414 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bbc6d33e-ce32-4b5f-91aa-06ff7d8daae1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 10 10:24:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:13.249 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[8b20b543-fd7a-4b50-9c88-5f944306e2b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:13 compute-0 systemd[1]: run-netns-ovnmeta\x2dbbc6d33e\x2dce32\x2d4b5f\x2d91aa\x2d06ff7d8daae1.mount: Deactivated successfully.
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.256 186993 INFO nova.compute.manager [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Took 0.40 seconds to destroy the instance on the hypervisor.
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.257 186993 DEBUG oslo.service.loopingcall [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.257 186993 DEBUG nova.compute.manager [-] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:24:13 compute-0 nova_compute[186989]: 2025-12-10 10:24:13.257 186993 DEBUG nova.network.neutron [-] [instance: 247bc6b9-be00-4776-a195-f7a413953734] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.490 186993 DEBUG nova.network.neutron [-] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.508 186993 INFO nova.compute.manager [-] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Took 1.25 seconds to deallocate network for instance.
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.565 186993 DEBUG oslo_concurrency.lockutils [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.565 186993 DEBUG oslo_concurrency.lockutils [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.615 186993 DEBUG nova.compute.provider_tree [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.640 186993 DEBUG nova.scheduler.client.report [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.662 186993 DEBUG oslo_concurrency.lockutils [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.685 186993 INFO nova.scheduler.client.report [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance 247bc6b9-be00-4776-a195-f7a413953734
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.743 186993 DEBUG oslo_concurrency.lockutils [None req-6ede987a-365d-4d16-a621-8b53ce0c603c 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.895s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.872 186993 DEBUG nova.compute.manager [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Received event network-vif-unplugged-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.872 186993 DEBUG oslo_concurrency.lockutils [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "247bc6b9-be00-4776-a195-f7a413953734-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.872 186993 DEBUG oslo_concurrency.lockutils [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.873 186993 DEBUG oslo_concurrency.lockutils [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.873 186993 DEBUG nova.compute.manager [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] No waiting events found dispatching network-vif-unplugged-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.873 186993 WARNING nova.compute.manager [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Received unexpected event network-vif-unplugged-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 for instance with vm_state deleted and task_state None.
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.873 186993 DEBUG nova.compute.manager [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Received event network-vif-plugged-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.873 186993 DEBUG oslo_concurrency.lockutils [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "247bc6b9-be00-4776-a195-f7a413953734-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.874 186993 DEBUG oslo_concurrency.lockutils [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.874 186993 DEBUG oslo_concurrency.lockutils [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "247bc6b9-be00-4776-a195-f7a413953734-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.874 186993 DEBUG nova.compute.manager [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] No waiting events found dispatching network-vif-plugged-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.874 186993 WARNING nova.compute.manager [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Received unexpected event network-vif-plugged-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 for instance with vm_state deleted and task_state None.
Dec 10 10:24:14 compute-0 nova_compute[186989]: 2025-12-10 10:24:14.874 186993 DEBUG nova.compute.manager [req-9089bc19-95a3-4f03-9c78-7caaa40fea9c req-0960a782-6a73-4ca1-b7df-f12ffe49da70 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Received event network-vif-deleted-43a5ac9c-0fff-4cfe-82cf-83d2b2b58bf5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:24:17 compute-0 nova_compute[186989]: 2025-12-10 10:24:17.411 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:18 compute-0 podman[215546]: 2025-12-10 10:24:18.02935963 +0000 UTC m=+0.072910099 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 10 10:24:18 compute-0 nova_compute[186989]: 2025-12-10 10:24:18.158 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:19 compute-0 nova_compute[186989]: 2025-12-10 10:24:19.044 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:19 compute-0 nova_compute[186989]: 2025-12-10 10:24:19.111 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:20 compute-0 podman[215571]: 2025-12-10 10:24:20.025755291 +0000 UTC m=+0.071204973 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 10 10:24:22 compute-0 nova_compute[186989]: 2025-12-10 10:24:22.414 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:23 compute-0 nova_compute[186989]: 2025-12-10 10:24:23.160 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:24 compute-0 sshd-session[215592]: error: kex_exchange_identification: read: Connection reset by peer
Dec 10 10:24:24 compute-0 sshd-session[215592]: Connection reset by 45.140.17.97 port 51738
Dec 10 10:24:25 compute-0 nova_compute[186989]: 2025-12-10 10:24:25.223 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362250.2224119, 2b32429c-b1df-4be5-abed-5c314400a14a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:24:25 compute-0 nova_compute[186989]: 2025-12-10 10:24:25.224 186993 INFO nova.compute.manager [-] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] VM Stopped (Lifecycle Event)
Dec 10 10:24:25 compute-0 nova_compute[186989]: 2025-12-10 10:24:25.252 186993 DEBUG nova.compute.manager [None req-d1a16fcb-657c-4488-a3ca-d5ef217481d0 - - - - - -] [instance: 2b32429c-b1df-4be5-abed-5c314400a14a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:24:26 compute-0 podman[215594]: 2025-12-10 10:24:26.033682215 +0000 UTC m=+0.070062003 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 10 10:24:26 compute-0 podman[215593]: 2025-12-10 10:24:26.036257624 +0000 UTC m=+0.074131282 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:24:26 compute-0 podman[215595]: 2025-12-10 10:24:26.086567366 +0000 UTC m=+0.120317563 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:24:27 compute-0 nova_compute[186989]: 2025-12-10 10:24:27.445 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:28 compute-0 nova_compute[186989]: 2025-12-10 10:24:28.124 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362253.1225336, 247bc6b9-be00-4776-a195-f7a413953734 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:24:28 compute-0 nova_compute[186989]: 2025-12-10 10:24:28.124 186993 INFO nova.compute.manager [-] [instance: 247bc6b9-be00-4776-a195-f7a413953734] VM Stopped (Lifecycle Event)
Dec 10 10:24:28 compute-0 nova_compute[186989]: 2025-12-10 10:24:28.143 186993 DEBUG nova.compute.manager [None req-837f04c2-09e6-47b6-8ceb-d0725621700d - - - - - -] [instance: 247bc6b9-be00-4776-a195-f7a413953734] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:24:28 compute-0 nova_compute[186989]: 2025-12-10 10:24:28.163 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:31.466 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:31.466 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:31.466 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:31 compute-0 nova_compute[186989]: 2025-12-10 10:24:31.761 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:31 compute-0 nova_compute[186989]: 2025-12-10 10:24:31.762 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:31 compute-0 nova_compute[186989]: 2025-12-10 10:24:31.779 186993 DEBUG nova.compute.manager [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:24:31 compute-0 nova_compute[186989]: 2025-12-10 10:24:31.917 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:31 compute-0 nova_compute[186989]: 2025-12-10 10:24:31.917 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:31 compute-0 nova_compute[186989]: 2025-12-10 10:24:31.925 186993 DEBUG nova.virt.hardware [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:24:31 compute-0 nova_compute[186989]: 2025-12-10 10:24:31.926 186993 INFO nova.compute.claims [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.038 186993 DEBUG nova.compute.provider_tree [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.052 186993 DEBUG nova.scheduler.client.report [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.069 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.070 186993 DEBUG nova.compute.manager [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.110 186993 DEBUG nova.compute.manager [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.111 186993 DEBUG nova.network.neutron [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.129 186993 INFO nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.145 186993 DEBUG nova.compute.manager [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.226 186993 DEBUG nova.compute.manager [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.228 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.228 186993 INFO nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Creating image(s)
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.229 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.229 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.230 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.241 186993 DEBUG oslo_concurrency.processutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.271 186993 DEBUG nova.policy [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.313 186993 DEBUG oslo_concurrency.processutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.314 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.315 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.326 186993 DEBUG oslo_concurrency.processutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.411 186993 DEBUG oslo_concurrency.processutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.412 186993 DEBUG oslo_concurrency.processutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.448 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.454 186993 DEBUG oslo_concurrency.processutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.454 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.455 186993 DEBUG oslo_concurrency.processutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.514 186993 DEBUG oslo_concurrency.processutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.515 186993 DEBUG nova.virt.disk.api [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.515 186993 DEBUG oslo_concurrency.processutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.572 186993 DEBUG oslo_concurrency.processutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.573 186993 DEBUG nova.virt.disk.api [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.574 186993 DEBUG nova.objects.instance [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid 77bc78a9-08a2-448f-b9c0-cfd055940b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.590 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.590 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Ensure instance console log exists: /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.591 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.591 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:32 compute-0 nova_compute[186989]: 2025-12-10 10:24:32.592 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:33 compute-0 nova_compute[186989]: 2025-12-10 10:24:33.065 186993 DEBUG nova.network.neutron [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Successfully created port: 507bf448-94f2-4c23-86a4-a13b31717ff8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:24:33 compute-0 nova_compute[186989]: 2025-12-10 10:24:33.165 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:35 compute-0 podman[215674]: 2025-12-10 10:24:35.033984737 +0000 UTC m=+0.077907464 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 10 10:24:37 compute-0 nova_compute[186989]: 2025-12-10 10:24:37.451 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:38 compute-0 podman[215695]: 2025-12-10 10:24:38.028153435 +0000 UTC m=+0.073210607 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 10 10:24:38 compute-0 nova_compute[186989]: 2025-12-10 10:24:38.167 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:39 compute-0 nova_compute[186989]: 2025-12-10 10:24:39.443 186993 DEBUG nova.network.neutron [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Successfully updated port: 507bf448-94f2-4c23-86a4-a13b31717ff8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:24:39 compute-0 nova_compute[186989]: 2025-12-10 10:24:39.468 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:24:39 compute-0 nova_compute[186989]: 2025-12-10 10:24:39.469 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:24:39 compute-0 nova_compute[186989]: 2025-12-10 10:24:39.469 186993 DEBUG nova.network.neutron [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:24:39 compute-0 nova_compute[186989]: 2025-12-10 10:24:39.549 186993 DEBUG nova.compute.manager [req-1d03c9b0-bbe3-42dd-a2ee-dd1de4dea75b req-7217094f-903b-4c6e-8508-412659c45872 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-changed-507bf448-94f2-4c23-86a4-a13b31717ff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:24:39 compute-0 nova_compute[186989]: 2025-12-10 10:24:39.549 186993 DEBUG nova.compute.manager [req-1d03c9b0-bbe3-42dd-a2ee-dd1de4dea75b req-7217094f-903b-4c6e-8508-412659c45872 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Refreshing instance network info cache due to event network-changed-507bf448-94f2-4c23-86a4-a13b31717ff8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:24:39 compute-0 nova_compute[186989]: 2025-12-10 10:24:39.549 186993 DEBUG oslo_concurrency.lockutils [req-1d03c9b0-bbe3-42dd-a2ee-dd1de4dea75b req-7217094f-903b-4c6e-8508-412659c45872 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:24:40 compute-0 nova_compute[186989]: 2025-12-10 10:24:40.418 186993 DEBUG nova.network.neutron [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.452 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.591 186993 DEBUG nova.network.neutron [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updating instance_info_cache with network_info: [{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.617 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.618 186993 DEBUG nova.compute.manager [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Instance network_info: |[{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.619 186993 DEBUG oslo_concurrency.lockutils [req-1d03c9b0-bbe3-42dd-a2ee-dd1de4dea75b req-7217094f-903b-4c6e-8508-412659c45872 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.619 186993 DEBUG nova.network.neutron [req-1d03c9b0-bbe3-42dd-a2ee-dd1de4dea75b req-7217094f-903b-4c6e-8508-412659c45872 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Refreshing network info cache for port 507bf448-94f2-4c23-86a4-a13b31717ff8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.622 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Start _get_guest_xml network_info=[{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.629 186993 WARNING nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.641 186993 DEBUG nova.virt.libvirt.host [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.641 186993 DEBUG nova.virt.libvirt.host [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.645 186993 DEBUG nova.virt.libvirt.host [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.646 186993 DEBUG nova.virt.libvirt.host [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.646 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.647 186993 DEBUG nova.virt.hardware [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.647 186993 DEBUG nova.virt.hardware [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.648 186993 DEBUG nova.virt.hardware [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.648 186993 DEBUG nova.virt.hardware [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.648 186993 DEBUG nova.virt.hardware [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.648 186993 DEBUG nova.virt.hardware [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.649 186993 DEBUG nova.virt.hardware [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.649 186993 DEBUG nova.virt.hardware [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.649 186993 DEBUG nova.virt.hardware [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.650 186993 DEBUG nova.virt.hardware [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.650 186993 DEBUG nova.virt.hardware [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.654 186993 DEBUG nova.virt.libvirt.vif [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:24:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1350364126',display_name='tempest-TestNetworkBasicOps-server-1350364126',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1350364126',id=6,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJq75Cwf3fE3Eo6aSvVEw6ZmFrXTJMC7KUQtffEBX2EhKh3zXojN07EirD/YNtNzowas01LwSdkjT048U0kK1Pkd1upNeKr0R9xHgP3GlO+3xbjcu8vRl65sDom+kt9XeQ==',key_name='tempest-TestNetworkBasicOps-368979134',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-b5az00dp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:24:32Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=77bc78a9-08a2-448f-b9c0-cfd055940b6b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.655 186993 DEBUG nova.network.os_vif_util [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.656 186993 DEBUG nova.network.os_vif_util [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:74:34,bridge_name='br-int',has_traffic_filtering=True,id=507bf448-94f2-4c23-86a4-a13b31717ff8,network=Network(16c8959b-0f9c-462b-981f-7320145346f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap507bf448-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.657 186993 DEBUG nova.objects.instance [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid 77bc78a9-08a2-448f-b9c0-cfd055940b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.677 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:24:42 compute-0 nova_compute[186989]:   <uuid>77bc78a9-08a2-448f-b9c0-cfd055940b6b</uuid>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   <name>instance-00000006</name>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-1350364126</nova:name>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:24:42</nova:creationTime>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:24:42 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:24:42 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:24:42 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:24:42 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:24:42 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:24:42 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:24:42 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:24:42 compute-0 nova_compute[186989]:         <nova:port uuid="507bf448-94f2-4c23-86a4-a13b31717ff8">
Dec 10 10:24:42 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <system>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <entry name="serial">77bc78a9-08a2-448f-b9c0-cfd055940b6b</entry>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <entry name="uuid">77bc78a9-08a2-448f-b9c0-cfd055940b6b</entry>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     </system>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   <os>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   </os>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   <features>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   </features>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.config"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:89:74:34"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <target dev="tap507bf448-94"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/console.log" append="off"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <video>
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     </video>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:24:42 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:24:42 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:24:42 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:24:42 compute-0 nova_compute[186989]: </domain>
Dec 10 10:24:42 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.679 186993 DEBUG nova.compute.manager [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Preparing to wait for external event network-vif-plugged-507bf448-94f2-4c23-86a4-a13b31717ff8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.680 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.680 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.680 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.681 186993 DEBUG nova.virt.libvirt.vif [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:24:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1350364126',display_name='tempest-TestNetworkBasicOps-server-1350364126',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1350364126',id=6,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJq75Cwf3fE3Eo6aSvVEw6ZmFrXTJMC7KUQtffEBX2EhKh3zXojN07EirD/YNtNzowas01LwSdkjT048U0kK1Pkd1upNeKr0R9xHgP3GlO+3xbjcu8vRl65sDom+kt9XeQ==',key_name='tempest-TestNetworkBasicOps-368979134',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-b5az00dp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:24:32Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=77bc78a9-08a2-448f-b9c0-cfd055940b6b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.681 186993 DEBUG nova.network.os_vif_util [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.682 186993 DEBUG nova.network.os_vif_util [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:74:34,bridge_name='br-int',has_traffic_filtering=True,id=507bf448-94f2-4c23-86a4-a13b31717ff8,network=Network(16c8959b-0f9c-462b-981f-7320145346f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap507bf448-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.683 186993 DEBUG os_vif [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:74:34,bridge_name='br-int',has_traffic_filtering=True,id=507bf448-94f2-4c23-86a4-a13b31717ff8,network=Network(16c8959b-0f9c-462b-981f-7320145346f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap507bf448-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.683 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.684 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.684 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.688 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.689 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap507bf448-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.689 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap507bf448-94, col_values=(('external_ids', {'iface-id': '507bf448-94f2-4c23-86a4-a13b31717ff8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:74:34', 'vm-uuid': '77bc78a9-08a2-448f-b9c0-cfd055940b6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.691 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:42 compute-0 NetworkManager[55541]: <info>  [1765362282.6920] manager: (tap507bf448-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.693 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.703 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.704 186993 INFO os_vif [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:74:34,bridge_name='br-int',has_traffic_filtering=True,id=507bf448-94f2-4c23-86a4-a13b31717ff8,network=Network(16c8959b-0f9c-462b-981f-7320145346f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap507bf448-94')
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.756 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.756 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.756 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:89:74:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:24:42 compute-0 nova_compute[186989]: 2025-12-10 10:24:42.757 186993 INFO nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Using config drive
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.421 186993 INFO nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Creating config drive at /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.config
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.426 186993 DEBUG oslo_concurrency.processutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp78_vwmdh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.553 186993 DEBUG oslo_concurrency.processutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp78_vwmdh" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:24:44 compute-0 kernel: tap507bf448-94: entered promiscuous mode
Dec 10 10:24:44 compute-0 NetworkManager[55541]: <info>  [1765362284.6250] manager: (tap507bf448-94): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Dec 10 10:24:44 compute-0 ovn_controller[95452]: 2025-12-10T10:24:44Z|00085|binding|INFO|Claiming lport 507bf448-94f2-4c23-86a4-a13b31717ff8 for this chassis.
Dec 10 10:24:44 compute-0 ovn_controller[95452]: 2025-12-10T10:24:44Z|00086|binding|INFO|507bf448-94f2-4c23-86a4-a13b31717ff8: Claiming fa:16:3e:89:74:34 10.100.0.14
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.625 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.632 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.639 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:74:34 10.100.0.14'], port_security=['fa:16:3e:89:74:34 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16c8959b-0f9c-462b-981f-7320145346f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': '94e10544-6f2b-462a-accb-9b6e66b1904b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd1c942b-2467-4df4-bbeb-865ba1260aad, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=507bf448-94f2-4c23-86a4-a13b31717ff8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.641 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 507bf448-94f2-4c23-86a4-a13b31717ff8 in datapath 16c8959b-0f9c-462b-981f-7320145346f8 bound to our chassis
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.643 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 16c8959b-0f9c-462b-981f-7320145346f8
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.656 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5b98167c-8af7-4939-9716-0cef7bd192fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.658 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap16c8959b-01 in ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 10 10:24:44 compute-0 systemd-udevd[215740]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.660 213247 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap16c8959b-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.660 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[38ac015c-3978-4856-882d-8cb0a8752658]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.661 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[8175e837-01e0-43e2-92b4-a87e702d90a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 systemd-machined[153379]: New machine qemu-6-instance-00000006.
Dec 10 10:24:44 compute-0 NetworkManager[55541]: <info>  [1765362284.6758] device (tap507bf448-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:24:44 compute-0 NetworkManager[55541]: <info>  [1765362284.6766] device (tap507bf448-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.676 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[04058258-f351-43e6-835e-0c3dac56cc8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.684 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:44 compute-0 ovn_controller[95452]: 2025-12-10T10:24:44Z|00087|binding|INFO|Setting lport 507bf448-94f2-4c23-86a4-a13b31717ff8 ovn-installed in OVS
Dec 10 10:24:44 compute-0 ovn_controller[95452]: 2025-12-10T10:24:44Z|00088|binding|INFO|Setting lport 507bf448-94f2-4c23-86a4-a13b31717ff8 up in Southbound
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.692 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.694 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[718b48bf-1bd1-4ebd-b748-79e0826154e9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.721 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc24bc3-7b9b-498e-ba5b-11f942b8cea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 systemd-udevd[215744]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:24:44 compute-0 NetworkManager[55541]: <info>  [1765362284.7294] manager: (tap16c8959b-00): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.727 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f7237414-b0dd-48c9-8162-37544b67b3c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.765 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[528857a6-9512-46a2-a4cd-26434465600a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.770 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[e6a9eedf-edc2-4756-8a4a-fd07c7e8e866]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 NetworkManager[55541]: <info>  [1765362284.8026] device (tap16c8959b-00): carrier: link connected
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.809 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[14b8bb3f-7092-4cff-9fbc-ecb2397bfa03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.828 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[85867f05-39ed-46a3-9244-bb94794d9787]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16c8959b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:51:69:28'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 330002, 'reachable_time': 22095, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215773, 'error': None, 'target': 'ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.846 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[2bad5799-35da-4c34-9944-72f6cce5b7d2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe51:6928'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 330002, 'tstamp': 330002}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215774, 'error': None, 'target': 'ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.864 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[8fb22547-1845-41a6-8d9c-d8e7b9b73ef0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16c8959b-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:51:69:28'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 330002, 'reachable_time': 22095, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215775, 'error': None, 'target': 'ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.901 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[912e6714-8299-4b77-aaf8-2b13e27d932b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.949 186993 DEBUG nova.compute.manager [req-ff9aab8b-7abb-47d7-a462-4bc4aa439856 req-4cd2daec-5bda-4229-9785-8c2eb59ee841 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-vif-plugged-507bf448-94f2-4c23-86a4-a13b31717ff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.950 186993 DEBUG oslo_concurrency.lockutils [req-ff9aab8b-7abb-47d7-a462-4bc4aa439856 req-4cd2daec-5bda-4229-9785-8c2eb59ee841 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.950 186993 DEBUG oslo_concurrency.lockutils [req-ff9aab8b-7abb-47d7-a462-4bc4aa439856 req-4cd2daec-5bda-4229-9785-8c2eb59ee841 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.950 186993 DEBUG oslo_concurrency.lockutils [req-ff9aab8b-7abb-47d7-a462-4bc4aa439856 req-4cd2daec-5bda-4229-9785-8c2eb59ee841 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.950 186993 DEBUG nova.compute.manager [req-ff9aab8b-7abb-47d7-a462-4bc4aa439856 req-4cd2daec-5bda-4229-9785-8c2eb59ee841 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Processing event network-vif-plugged-507bf448-94f2-4c23-86a4-a13b31717ff8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.974 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5dc0e5-efb5-4af6-bea6-015ef1cf0ab7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.976 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16c8959b-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.976 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.976 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16c8959b-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.978 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:44 compute-0 NetworkManager[55541]: <info>  [1765362284.9793] manager: (tap16c8959b-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Dec 10 10:24:44 compute-0 kernel: tap16c8959b-00: entered promiscuous mode
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.980 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.981 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap16c8959b-00, col_values=(('external_ids', {'iface-id': 'd89a5400-4042-4e9f-87fc-cd18de8a733b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.982 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:44 compute-0 ovn_controller[95452]: 2025-12-10T10:24:44Z|00089|binding|INFO|Releasing lport d89a5400-4042-4e9f-87fc-cd18de8a733b from this chassis (sb_readonly=0)
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.983 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.984 104302 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/16c8959b-0f9c-462b-981f-7320145346f8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/16c8959b-0f9c-462b-981f-7320145346f8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.985 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[b819c50a-170f-453c-a2a5-35beb03a8eb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.987 104302 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: global
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     log         /dev/log local0 debug
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     log-tag     haproxy-metadata-proxy-16c8959b-0f9c-462b-981f-7320145346f8
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     user        root
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     group       root
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     maxconn     1024
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     pidfile     /var/lib/neutron/external/pids/16c8959b-0f9c-462b-981f-7320145346f8.pid.haproxy
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     daemon
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: defaults
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     log global
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     mode http
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     option httplog
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     option dontlognull
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     option http-server-close
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     option forwardfor
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     retries                 3
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     timeout http-request    30s
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     timeout connect         30s
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     timeout client          32s
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     timeout server          32s
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     timeout http-keep-alive 30s
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: listen listener
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     bind 169.254.169.254:80
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     server metadata /var/lib/neutron/metadata_proxy
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:     http-request add-header X-OVN-Network-ID 16c8959b-0f9c-462b-981f-7320145346f8
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 10 10:24:44 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:44.988 104302 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8', 'env', 'PROCESS_TAG=haproxy-16c8959b-0f9c-462b-981f-7320145346f8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/16c8959b-0f9c-462b-981f-7320145346f8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 10 10:24:44 compute-0 nova_compute[186989]: 2025-12-10 10:24:44.995 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:45 compute-0 podman[215807]: 2025-12-10 10:24:45.418203979 +0000 UTC m=+0.107265491 container create 5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.425 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362285.4250062, 77bc78a9-08a2-448f-b9c0-cfd055940b6b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.427 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] VM Started (Lifecycle Event)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.428 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'name': 'tempest-TestNetworkBasicOps-server-1350364126', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '82da19f85bb840d2a70395c3d761ef38', 'user_id': '603f9c3a99e145e4a64248329321a249', 'hostId': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.429 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.432 186993 DEBUG nova.compute.manager [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.434 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 77bc78a9-08a2-448f-b9c0-cfd055940b6b / tap507bf448-94 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 10 10:24:45 compute-0 podman[215807]: 2025-12-10 10:24:45.338042377 +0000 UTC m=+0.027103909 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.434 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.436 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e45331f-d270-426e-9189-a3beebaf935a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000006-77bc78a9-08a2-448f-b9c0-cfd055940b6b-tap507bf448-94', 'timestamp': '2025-12-10T10:24:45.430286', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'tap507bf448-94', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:74:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap507bf448-94'}, 'message_id': '72bcbca8-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.703723316, 'message_signature': 'e059173e8fde1d8a1966839c4c64853682f253ae1f4445e28705fc932a50fb9d'}]}, 'timestamp': '2025-12-10 10:24:45.435439', '_unique_id': '0b6a125a2d6343309c947c419bb9881c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.437 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.438 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.442 186993 INFO nova.virt.libvirt.driver [-] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Instance spawned successfully.
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.443 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.452 186993 DEBUG nova.network.neutron [req-1d03c9b0-bbe3-42dd-a2ee-dd1de4dea75b req-7217094f-903b-4c6e-8508-412659c45872 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updated VIF entry in instance network info cache for port 507bf448-94f2-4c23-86a4-a13b31717ff8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.453 186993 DEBUG nova.network.neutron [req-1d03c9b0-bbe3-42dd-a2ee-dd1de4dea75b req-7217094f-903b-4c6e-8508-412659c45872 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updating instance_info_cache with network_info: [{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.474 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.476 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 systemd[1]: Started libpod-conmon-5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf.scope.
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9c4fe87-3318-43e9-93c7-5866906b200d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-vda', 'timestamp': '2025-12-10T10:24:45.438935', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '72c2eda8-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.712286446, 'message_signature': '28aa1aaa6da0725c5a1ac23cc8ed0ebe73c21f6d11693315af50842e6f6f5b0a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-sda', 'timestamp': '2025-12-10T10:24:45.438935', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '72c2fe24-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.712286446, 'message_signature': 'df13cf421c56096c528a7b99656053ad0b21a8c08b5c1ef0614655d62fdc5371'}]}, 'timestamp': '2025-12-10 10:24:45.476340', '_unique_id': '1196a0c7aafb42bcab087a2658727282'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.477 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.478 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.489 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.490 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5b35ba2-09d5-451a-9a40-ab03729271c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-vda', 'timestamp': '2025-12-10T10:24:45.479185', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '72c52000-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.752610659, 'message_signature': 'ea8d6310e27be1d679eaced5466e0cab509427eff8d263c24f6e22429eb7e0cc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-sda', 'timestamp': '2025-12-10T10:24:45.479185', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '72c52d02-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.752610659, 'message_signature': '882fbf726e0657c91735e0d1c697be3ec22fb13b85e54013985bf46b53ffe80c'}]}, 'timestamp': '2025-12-10 10:24:45.490635', '_unique_id': 'a832fe68a0574ece87cef0e141cfd977'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.492 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.493 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.493 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.494 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3902be44-d316-427f-b9c1-e3dd26b270ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000006-77bc78a9-08a2-448f-b9c0-cfd055940b6b-tap507bf448-94', 'timestamp': '2025-12-10T10:24:45.493343', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'tap507bf448-94', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:74:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap507bf448-94'}, 'message_id': '72c5a35e-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.703723316, 'message_signature': '61e662012bba4fbe7a916cf327eb7787984efe2eb91159e2ff471d66012a945d'}]}, 'timestamp': '2025-12-10 10:24:45.493696', '_unique_id': '0cd328aaab9840e5975b503a73b50097'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.494 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.495 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.495 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.495 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e94bf48-149e-4721-ac57-e1b8b6579a01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-vda', 'timestamp': '2025-12-10T10:24:45.495396', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '72c5f296-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.712286446, 'message_signature': 'e9dd5e924500d57a6971577b8b63b704fe08426414cf2d721af4a1f2a64c9659'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-sda', 'timestamp': '2025-12-10T10:24:45.495396', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '72c5ff5c-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.712286446, 'message_signature': '202f011b04d4d91f596da8dda808e65f19c91f6d9f58f5998d772897ba8be3cc'}]}, 'timestamp': '2025-12-10 10:24:45.496043', '_unique_id': '706a229548bf4fa6819254b963f46d89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.496 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.497 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.497 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.497 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1350364126>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1350364126>]
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.498 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.498 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.498 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.498 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc9dbe59-6595-4d72-92be-40d8bd77037b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-vda', 'timestamp': '2025-12-10T10:24:45.498184', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '72c65fe2-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.712286446, 'message_signature': '7c61ced3f5dffd59142c8af105e43e077014accba8d28e5ac993cc21d88beb0c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 
'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-sda', 'timestamp': '2025-12-10T10:24:45.498184', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '72c66bae-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.712286446, 'message_signature': '2ae1be6f022a462897cb0dbebc51fe9b80f4061d85b8e3fc91bbc2e029512099'}]}, 'timestamp': '2025-12-10 10:24:45.498827', '_unique_id': '39842da98b054abdaaa8661467f36218'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.499 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.502 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.503 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.503 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aef2c0e0-5efa-47c9-863f-19c6cac6250b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-vda', 'timestamp': '2025-12-10T10:24:45.502998', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '72c72436-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.752610659, 'message_signature': 'b1b9655fb67164a82c4373c21c7e3f34d97a5410cd986eda535f8a88dbeb2d8e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 
'77bc78a9-08a2-448f-b9c0-cfd055940b6b-sda', 'timestamp': '2025-12-10T10:24:45.502998', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '72c73bce-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.752610659, 'message_signature': '7be6fb537b3c26e9b53b1cd38bd41e2095cbd4bb86a93b34316f82840abc92dc'}]}, 'timestamp': '2025-12-10 10:24:45.504240', '_unique_id': '0e09a7b36cf34524853d51e73df2e04e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.505 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6919f003-0d3b-43d7-913f-2809817d92bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000006-77bc78a9-08a2-448f-b9c0-cfd055940b6b-tap507bf448-94', 'timestamp': '2025-12-10T10:24:45.505819', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'tap507bf448-94', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:74:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap507bf448-94'}, 'message_id': '72c78a5c-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.703723316, 'message_signature': 'd26250d3edec3ab2570977e4ef6a6fa2d5e8fb73bd8399157788a969e0e7225f'}]}, 'timestamp': '2025-12-10 10:24:45.506164', '_unique_id': 'cdb0a184eed149e48d558f0e98566da7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.506 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.507 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 10 10:24:45 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.514 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.515 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.517 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.518 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:24:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/781f5217a91584c29cfb7658b131453b156889bdc34ea22561100e71d0940656/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.518 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.519 186993 DEBUG nova.virt.libvirt.driver [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.526 186993 DEBUG oslo_concurrency.lockutils [req-1d03c9b0-bbe3-42dd-a2ee-dd1de4dea75b req-7217094f-903b-4c6e-8508-412659c45872 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.526 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.526 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362285.4252613, 77bc78a9-08a2-448f-b9c0-cfd055940b6b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.527 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] VM Paused (Lifecycle Event)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.538 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.538 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 77bc78a9-08a2-448f-b9c0-cfd055940b6b: ceilometer.compute.pollsters.NoVolumeException
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.538 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.538 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9008bc1b-8344-4263-b596-36fbe75313b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000006-77bc78a9-08a2-448f-b9c0-cfd055940b6b-tap507bf448-94', 'timestamp': '2025-12-10T10:24:45.538711', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'tap507bf448-94', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:74:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap507bf448-94'}, 'message_id': '72cc9312-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.703723316, 'message_signature': '0e0029459e181ef5ff86bc72a5e12e4c76e55a7e730757bed7d3eda667fce26c'}]}, 'timestamp': '2025-12-10 10:24:45.539197', '_unique_id': '8594eb80f62640a890d2f8909061c551'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.540 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.541 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.541 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/cpu volume: 60000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff368f85-3b42-4ee6-8ede-3484441887ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60000000, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'timestamp': '2025-12-10T10:24:45.541186', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '72cced58-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.811353367, 'message_signature': 'fe88dd7ddca707580a528a9df68561220dbb177688a07ebdbde5c5f0297dc337'}]}, 'timestamp': '2025-12-10 10:24:45.541445', '_unique_id': 'd940a04fa3bb4e96bc1f087dab4b31a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.542 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1350364126>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1350364126>]
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.543 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.543 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.543 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93b887d0-74f6-4429-ac1c-7ad749741175', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-vda', 'timestamp': '2025-12-10T10:24:45.543093', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '72cd39de-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.712286446, 'message_signature': '37f24aee41a081dac83bd9f53a0d2c9c5c24cf3039463e629dfac6b43e0afdb4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-sda', 'timestamp': '2025-12-10T10:24:45.543093', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '72cd4474-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.712286446, 'message_signature': '699f1d92a7fc2b42e5e12378d3da0c15265aad6cf14987a95acc206d263ad14d'}]}, 'timestamp': '2025-12-10 10:24:45.543657', '_unique_id': '7eaed933bcf04f048f007472e360f8fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.544 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44eb4c81-33c2-491b-8a37-d7bf135a5c27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000006-77bc78a9-08a2-448f-b9c0-cfd055940b6b-tap507bf448-94', 'timestamp': '2025-12-10T10:24:45.544835', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'tap507bf448-94', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:74:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap507bf448-94'}, 'message_id': '72cd7be2-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.703723316, 'message_signature': 'f3a171e1a35d56018e407fcf4a0591402e8d18562ea57942ea6668b6c397b135'}]}, 'timestamp': '2025-12-10 10:24:45.545068', '_unique_id': 'ec7642acdca443309fa3e578312bb6ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 podman[215807]: 2025-12-10 10:24:45.546352762 +0000 UTC m=+0.235414304 container init 5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.545 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.546 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.546 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.546 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d191d0d-26d5-4be9-be50-9d39dbac669d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-vda', 'timestamp': '2025-12-10T10:24:45.546283', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '72cdb490-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.712286446, 'message_signature': '34bd4222d3d13c586867b2df13220de2c3db0e9381e2ab1740bc7f5122303c1e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-sda', 'timestamp': '2025-12-10T10:24:45.546283', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '72cdbcb0-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.712286446, 'message_signature': '5949f4b28b7ea68722b10ef21963035d46b8b9914ff1df59e0c50157a3e5927c'}]}, 'timestamp': '2025-12-10 10:24:45.546768', '_unique_id': 'dce8919da2a848b1941beba452d8a3d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.547 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.548 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.548 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbce8006-5497-4e6c-ac40-dd5b326c3f35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000006-77bc78a9-08a2-448f-b9c0-cfd055940b6b-tap507bf448-94', 'timestamp': '2025-12-10T10:24:45.548281', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'tap507bf448-94', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:74:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap507bf448-94'}, 'message_id': '72ce0292-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.703723316, 'message_signature': '27f6bfd6c93f80c72757358588948f99c5b7efd1901b29d5380dfdd36854e188'}]}, 'timestamp': '2025-12-10 10:24:45.548518', '_unique_id': '29dd08472fb04579b80ba12b84d59ba8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.549 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1350364126>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1350364126>]
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1caeaf03-b91e-4cfd-94bc-0de2ec9860fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000006-77bc78a9-08a2-448f-b9c0-cfd055940b6b-tap507bf448-94', 'timestamp': '2025-12-10T10:24:45.550166', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'tap507bf448-94', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:74:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap507bf448-94'}, 'message_id': '72ce4c34-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.703723316, 'message_signature': 'a4f174c66a8d4a9b01c66cd9b5415ea293a3d40c41de9b454763d6c6fb1bac64'}]}, 'timestamp': '2025-12-10 10:24:45.550402', '_unique_id': 'ef36b268ef0d4c649d45c4b0999ed9a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.550 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.551 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.551 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 podman[215807]: 2025-12-10 10:24:45.552772645 +0000 UTC m=+0.241834157 container start 5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '514e1f63-549c-469a-ac53-29d53dfdd784', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000006-77bc78a9-08a2-448f-b9c0-cfd055940b6b-tap507bf448-94', 'timestamp': '2025-12-10T10:24:45.551636', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'tap507bf448-94', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:74:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap507bf448-94'}, 'message_id': '72ce8690-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.703723316, 'message_signature': 'aaef761f35cd0225a6bf2f32f67049d726f9ca1fe423f414e55cdc48c98da156'}]}, 'timestamp': '2025-12-10 10:24:45.551895', '_unique_id': 'd6ab4b79155e4e9688c38a2cb593fd8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.552 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.553 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.553 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.553 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d5554d1-6b71-4b4c-bc75-27f4b8de601e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-vda', 'timestamp': '2025-12-10T10:24:45.553079', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '72cebdc2-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.712286446, 'message_signature': '3acd0dc2dfd747335ce5a95a06ba08851ee1e2ba6a280e8ad52398d224bbec91'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-sda', 'timestamp': '2025-12-10T10:24:45.553079', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '72cec574-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.712286446, 'message_signature': '003065e17269b5fb0f3f47cb8ee60606dc82f95680fc62d99050f855b5373c66'}]}, 'timestamp': '2025-12-10 10:24:45.553488', '_unique_id': 'ec144f2cafba46349a90dc43603edde6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.554 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae0cd679-d5ec-44f5-bad4-9313c0785f71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000006-77bc78a9-08a2-448f-b9c0-cfd055940b6b-tap507bf448-94', 'timestamp': '2025-12-10T10:24:45.554654', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'tap507bf448-94', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:74:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap507bf448-94'}, 'message_id': '72cefc42-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.703723316, 'message_signature': 'eb14176ca4c0c0f134530102da8d13afe279eb0da0adbfbb509242f9339e36ba'}]}, 'timestamp': '2025-12-10 10:24:45.554907', '_unique_id': 'a85d5b6c929242ada449962bb3e5cbbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.555 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5f71852-d98d-4242-bd8e-78ca77fd01e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-00000006-77bc78a9-08a2-448f-b9c0-cfd055940b6b-tap507bf448-94', 'timestamp': '2025-12-10T10:24:45.556166', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'tap507bf448-94', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:74:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap507bf448-94'}, 'message_id': '72cf3658-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.703723316, 'message_signature': '32a658c5063303cc87920748893bb4701a561e3988ce6278dbb4ffbb5e43d196'}]}, 'timestamp': '2025-12-10 10:24:45.556393', '_unique_id': '6d299e17668749ef95145656f67b52a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.556 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.557 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.557 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.557 12 DEBUG ceilometer.compute.pollsters [-] 77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '956caf9e-6b33-4a3c-ac32-870c82e62a3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-vda', 'timestamp': '2025-12-10T10:24:45.557551', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '72cf6c54-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.752610659, 'message_signature': '6f1c7404ef4b7c510d65a6d26fd6b5d2e068ec8c8bb729754c92b86750a6ddf0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b-sda', 'timestamp': '2025-12-10T10:24:45.557551', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1350364126', 'name': 'instance-00000006', 'instance_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '72cf74f6-d5b2-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3300.752610659, 'message_signature': '09c292d075812874d0574a194a85c912fca00fe41eee66078e6dc95e0c5bd068'}]}, 'timestamp': '2025-12-10 10:24:45.557981', '_unique_id': '7909e05aa32b4bfc99787175779a6e7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.558 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.559 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.559 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:24:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:24:45.559 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1350364126>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1350364126>]
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.574 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:24:45 compute-0 neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8[215828]: [NOTICE]   (215833) : New worker (215835) forked
Dec 10 10:24:45 compute-0 neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8[215828]: [NOTICE]   (215833) : Loading success.
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.579 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362285.4354284, 77bc78a9-08a2-448f-b9c0-cfd055940b6b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.579 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] VM Resumed (Lifecycle Event)
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.725 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.729 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.765 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.784 186993 INFO nova.compute.manager [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Took 13.56 seconds to spawn the instance on the hypervisor.
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.784 186993 DEBUG nova.compute.manager [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.840 186993 INFO nova.compute.manager [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Took 13.96 seconds to build instance.
Dec 10 10:24:45 compute-0 nova_compute[186989]: 2025-12-10 10:24:45.856 186993 DEBUG oslo_concurrency.lockutils [None req-aff978f6-1fbf-4015-ace7-1691f1df68d9 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:46 compute-0 nova_compute[186989]: 2025-12-10 10:24:46.704 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:24:47 compute-0 nova_compute[186989]: 2025-12-10 10:24:47.035 186993 DEBUG nova.compute.manager [req-6d12ea8e-7e45-41a5-8128-59d1a99887f6 req-8607de21-8b2e-4077-86a8-8d87d4c8e9c3 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-vif-plugged-507bf448-94f2-4c23-86a4-a13b31717ff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:24:47 compute-0 nova_compute[186989]: 2025-12-10 10:24:47.036 186993 DEBUG oslo_concurrency.lockutils [req-6d12ea8e-7e45-41a5-8128-59d1a99887f6 req-8607de21-8b2e-4077-86a8-8d87d4c8e9c3 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:47 compute-0 nova_compute[186989]: 2025-12-10 10:24:47.036 186993 DEBUG oslo_concurrency.lockutils [req-6d12ea8e-7e45-41a5-8128-59d1a99887f6 req-8607de21-8b2e-4077-86a8-8d87d4c8e9c3 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:47 compute-0 nova_compute[186989]: 2025-12-10 10:24:47.037 186993 DEBUG oslo_concurrency.lockutils [req-6d12ea8e-7e45-41a5-8128-59d1a99887f6 req-8607de21-8b2e-4077-86a8-8d87d4c8e9c3 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:47 compute-0 nova_compute[186989]: 2025-12-10 10:24:47.037 186993 DEBUG nova.compute.manager [req-6d12ea8e-7e45-41a5-8128-59d1a99887f6 req-8607de21-8b2e-4077-86a8-8d87d4c8e9c3 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] No waiting events found dispatching network-vif-plugged-507bf448-94f2-4c23-86a4-a13b31717ff8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:24:47 compute-0 nova_compute[186989]: 2025-12-10 10:24:47.037 186993 WARNING nova.compute.manager [req-6d12ea8e-7e45-41a5-8128-59d1a99887f6 req-8607de21-8b2e-4077-86a8-8d87d4c8e9c3 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received unexpected event network-vif-plugged-507bf448-94f2-4c23-86a4-a13b31717ff8 for instance with vm_state active and task_state None.
Dec 10 10:24:47 compute-0 nova_compute[186989]: 2025-12-10 10:24:47.455 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:47 compute-0 nova_compute[186989]: 2025-12-10 10:24:47.691 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:48 compute-0 nova_compute[186989]: 2025-12-10 10:24:48.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:24:48 compute-0 nova_compute[186989]: 2025-12-10 10:24:48.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:24:49 compute-0 podman[215844]: 2025-12-10 10:24:49.021970172 +0000 UTC m=+0.059873059 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:24:49 compute-0 nova_compute[186989]: 2025-12-10 10:24:49.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:24:49 compute-0 nova_compute[186989]: 2025-12-10 10:24:49.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:24:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:50.633 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:d5:69', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:b1:dd:ed:fa:0b'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:24:50 compute-0 nova_compute[186989]: 2025-12-10 10:24:50.634 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:50.635 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 10 10:24:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:24:50.637 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:24:50 compute-0 nova_compute[186989]: 2025-12-10 10:24:50.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:24:50 compute-0 nova_compute[186989]: 2025-12-10 10:24:50.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:24:50 compute-0 nova_compute[186989]: 2025-12-10 10:24:50.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:24:51 compute-0 podman[215869]: 2025-12-10 10:24:51.028883765 +0000 UTC m=+0.061487162 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Dec 10 10:24:51 compute-0 nova_compute[186989]: 2025-12-10 10:24:51.449 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:24:51 compute-0 nova_compute[186989]: 2025-12-10 10:24:51.450 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquired lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:24:51 compute-0 nova_compute[186989]: 2025-12-10 10:24:51.450 186993 DEBUG nova.network.neutron [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 10 10:24:51 compute-0 nova_compute[186989]: 2025-12-10 10:24:51.450 186993 DEBUG nova.objects.instance [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 77bc78a9-08a2-448f-b9c0-cfd055940b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:24:51 compute-0 nova_compute[186989]: 2025-12-10 10:24:51.894 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:51 compute-0 NetworkManager[55541]: <info>  [1765362291.8958] manager: (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Dec 10 10:24:51 compute-0 NetworkManager[55541]: <info>  [1765362291.8978] manager: (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Dec 10 10:24:51 compute-0 ovn_controller[95452]: 2025-12-10T10:24:51Z|00090|binding|INFO|Releasing lport d89a5400-4042-4e9f-87fc-cd18de8a733b from this chassis (sb_readonly=0)
Dec 10 10:24:51 compute-0 nova_compute[186989]: 2025-12-10 10:24:51.944 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:51 compute-0 ovn_controller[95452]: 2025-12-10T10:24:51Z|00091|binding|INFO|Releasing lport d89a5400-4042-4e9f-87fc-cd18de8a733b from this chassis (sb_readonly=0)
Dec 10 10:24:51 compute-0 nova_compute[186989]: 2025-12-10 10:24:51.949 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.348 186993 DEBUG nova.compute.manager [req-27c97dce-94eb-4f91-a911-2950d0fdcb75 req-d86a1bcd-f0ec-48cb-bcdf-47c30f8149f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-changed-507bf448-94f2-4c23-86a4-a13b31717ff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.349 186993 DEBUG nova.compute.manager [req-27c97dce-94eb-4f91-a911-2950d0fdcb75 req-d86a1bcd-f0ec-48cb-bcdf-47c30f8149f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Refreshing instance network info cache due to event network-changed-507bf448-94f2-4c23-86a4-a13b31717ff8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.349 186993 DEBUG oslo_concurrency.lockutils [req-27c97dce-94eb-4f91-a911-2950d0fdcb75 req-d86a1bcd-f0ec-48cb-bcdf-47c30f8149f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.456 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.693 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.775 186993 DEBUG nova.network.neutron [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updating instance_info_cache with network_info: [{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.795 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Releasing lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.795 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.795 186993 DEBUG oslo_concurrency.lockutils [req-27c97dce-94eb-4f91-a911-2950d0fdcb75 req-d86a1bcd-f0ec-48cb-bcdf-47c30f8149f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.796 186993 DEBUG nova.network.neutron [req-27c97dce-94eb-4f91-a911-2950d0fdcb75 req-d86a1bcd-f0ec-48cb-bcdf-47c30f8149f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Refreshing network info cache for port 507bf448-94f2-4c23-86a4-a13b31717ff8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.798 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.798 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.824 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.825 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.825 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.826 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:24:52 compute-0 nova_compute[186989]: 2025-12-10 10:24:52.925 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.006 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.008 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.068 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.234 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.236 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5609MB free_disk=73.32908630371094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.236 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.236 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.322 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Instance 77bc78a9-08a2-448f-b9c0-cfd055940b6b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.323 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.323 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.360 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.376 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.395 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.395 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.519 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:24:53 compute-0 nova_compute[186989]: 2025-12-10 10:24:53.520 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:24:54 compute-0 nova_compute[186989]: 2025-12-10 10:24:54.013 186993 DEBUG nova.network.neutron [req-27c97dce-94eb-4f91-a911-2950d0fdcb75 req-d86a1bcd-f0ec-48cb-bcdf-47c30f8149f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updated VIF entry in instance network info cache for port 507bf448-94f2-4c23-86a4-a13b31717ff8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:24:54 compute-0 nova_compute[186989]: 2025-12-10 10:24:54.014 186993 DEBUG nova.network.neutron [req-27c97dce-94eb-4f91-a911-2950d0fdcb75 req-d86a1bcd-f0ec-48cb-bcdf-47c30f8149f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updating instance_info_cache with network_info: [{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:24:54 compute-0 nova_compute[186989]: 2025-12-10 10:24:54.033 186993 DEBUG oslo_concurrency.lockutils [req-27c97dce-94eb-4f91-a911-2950d0fdcb75 req-d86a1bcd-f0ec-48cb-bcdf-47c30f8149f0 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:24:57 compute-0 podman[215910]: 2025-12-10 10:24:57.036606892 +0000 UTC m=+0.069131897 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 10 10:24:57 compute-0 podman[215909]: 2025-12-10 10:24:57.046371244 +0000 UTC m=+0.077092561 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 10 10:24:57 compute-0 podman[215911]: 2025-12-10 10:24:57.077766468 +0000 UTC m=+0.101598530 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:24:57 compute-0 nova_compute[186989]: 2025-12-10 10:24:57.514 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:57 compute-0 nova_compute[186989]: 2025-12-10 10:24:57.696 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:24:58 compute-0 ovn_controller[95452]: 2025-12-10T10:24:58Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:89:74:34 10.100.0.14
Dec 10 10:24:58 compute-0 ovn_controller[95452]: 2025-12-10T10:24:58Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:89:74:34 10.100.0.14
Dec 10 10:25:02 compute-0 nova_compute[186989]: 2025-12-10 10:25:02.516 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:02 compute-0 nova_compute[186989]: 2025-12-10 10:25:02.699 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:04 compute-0 nova_compute[186989]: 2025-12-10 10:25:04.705 186993 INFO nova.compute.manager [None req-2010395f-d9a2-4124-99fd-cb118b6bb175 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Get console output
Dec 10 10:25:05 compute-0 nova_compute[186989]: 2025-12-10 10:25:05.667 213152 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 10 10:25:06 compute-0 podman[215973]: 2025-12-10 10:25:06.05827742 +0000 UTC m=+0.092160646 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, vcs-type=git, managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 10 10:25:07 compute-0 nova_compute[186989]: 2025-12-10 10:25:07.519 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:07 compute-0 nova_compute[186989]: 2025-12-10 10:25:07.702 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:09 compute-0 podman[215994]: 2025-12-10 10:25:09.065281283 +0000 UTC m=+0.087229244 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:25:09 compute-0 nova_compute[186989]: 2025-12-10 10:25:09.813 186993 DEBUG oslo_concurrency.lockutils [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "interface-77bc78a9-08a2-448f-b9c0-cfd055940b6b-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:09 compute-0 nova_compute[186989]: 2025-12-10 10:25:09.814 186993 DEBUG oslo_concurrency.lockutils [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "interface-77bc78a9-08a2-448f-b9c0-cfd055940b6b-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:09 compute-0 nova_compute[186989]: 2025-12-10 10:25:09.815 186993 DEBUG nova.objects.instance [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'flavor' on Instance uuid 77bc78a9-08a2-448f-b9c0-cfd055940b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:25:10 compute-0 nova_compute[186989]: 2025-12-10 10:25:10.675 186993 DEBUG nova.objects.instance [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_requests' on Instance uuid 77bc78a9-08a2-448f-b9c0-cfd055940b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:25:10 compute-0 nova_compute[186989]: 2025-12-10 10:25:10.690 186993 DEBUG nova.network.neutron [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:25:10 compute-0 nova_compute[186989]: 2025-12-10 10:25:10.814 186993 DEBUG nova.policy [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:25:11 compute-0 nova_compute[186989]: 2025-12-10 10:25:11.529 186993 DEBUG nova.network.neutron [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Successfully created port: 56fa819b-df3d-49ba-a5c9-698cc74fb8aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:25:12 compute-0 nova_compute[186989]: 2025-12-10 10:25:12.128 186993 DEBUG nova.network.neutron [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Successfully updated port: 56fa819b-df3d-49ba-a5c9-698cc74fb8aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:25:12 compute-0 nova_compute[186989]: 2025-12-10 10:25:12.145 186993 DEBUG oslo_concurrency.lockutils [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:25:12 compute-0 nova_compute[186989]: 2025-12-10 10:25:12.145 186993 DEBUG oslo_concurrency.lockutils [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:25:12 compute-0 nova_compute[186989]: 2025-12-10 10:25:12.145 186993 DEBUG nova.network.neutron [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:25:12 compute-0 nova_compute[186989]: 2025-12-10 10:25:12.220 186993 DEBUG nova.compute.manager [req-1dba53e7-1f0e-43f5-b896-73de566bd82c req-592b121d-16d8-4551-ac57-4657e17f7808 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-changed-56fa819b-df3d-49ba-a5c9-698cc74fb8aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:25:12 compute-0 nova_compute[186989]: 2025-12-10 10:25:12.221 186993 DEBUG nova.compute.manager [req-1dba53e7-1f0e-43f5-b896-73de566bd82c req-592b121d-16d8-4551-ac57-4657e17f7808 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Refreshing instance network info cache due to event network-changed-56fa819b-df3d-49ba-a5c9-698cc74fb8aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:25:12 compute-0 nova_compute[186989]: 2025-12-10 10:25:12.221 186993 DEBUG oslo_concurrency.lockutils [req-1dba53e7-1f0e-43f5-b896-73de566bd82c req-592b121d-16d8-4551-ac57-4657e17f7808 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:25:12 compute-0 nova_compute[186989]: 2025-12-10 10:25:12.521 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:12 compute-0 nova_compute[186989]: 2025-12-10 10:25:12.703 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.724 186993 DEBUG nova.network.neutron [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updating instance_info_cache with network_info: [{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.748 186993 DEBUG oslo_concurrency.lockutils [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.750 186993 DEBUG oslo_concurrency.lockutils [req-1dba53e7-1f0e-43f5-b896-73de566bd82c req-592b121d-16d8-4551-ac57-4657e17f7808 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.751 186993 DEBUG nova.network.neutron [req-1dba53e7-1f0e-43f5-b896-73de566bd82c req-592b121d-16d8-4551-ac57-4657e17f7808 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Refreshing network info cache for port 56fa819b-df3d-49ba-a5c9-698cc74fb8aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.754 186993 DEBUG nova.virt.libvirt.vif [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:24:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1350364126',display_name='tempest-TestNetworkBasicOps-server-1350364126',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1350364126',id=6,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJq75Cwf3fE3Eo6aSvVEw6ZmFrXTJMC7KUQtffEBX2EhKh3zXojN07EirD/YNtNzowas01LwSdkjT048U0kK1Pkd1upNeKr0R9xHgP3GlO+3xbjcu8vRl65sDom+kt9XeQ==',key_name='tempest-TestNetworkBasicOps-368979134',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:24:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-b5az00dp',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:24:45Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=77bc78a9-08a2-448f-b9c0-cfd055940b6b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.755 186993 DEBUG nova.network.os_vif_util [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.756 186993 DEBUG nova.network.os_vif_util [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:65:10,bridge_name='br-int',has_traffic_filtering=True,id=56fa819b-df3d-49ba-a5c9-698cc74fb8aa,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fa819b-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.756 186993 DEBUG os_vif [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:65:10,bridge_name='br-int',has_traffic_filtering=True,id=56fa819b-df3d-49ba-a5c9-698cc74fb8aa,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fa819b-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.757 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.758 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.758 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.768 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.769 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap56fa819b-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.769 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap56fa819b-df, col_values=(('external_ids', {'iface-id': '56fa819b-df3d-49ba-a5c9-698cc74fb8aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:73:65:10', 'vm-uuid': '77bc78a9-08a2-448f-b9c0-cfd055940b6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:13 compute-0 NetworkManager[55541]: <info>  [1765362313.7722] manager: (tap56fa819b-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.771 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.775 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.778 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.779 186993 INFO os_vif [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:65:10,bridge_name='br-int',has_traffic_filtering=True,id=56fa819b-df3d-49ba-a5c9-698cc74fb8aa,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fa819b-df')
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.780 186993 DEBUG nova.virt.libvirt.vif [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:24:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1350364126',display_name='tempest-TestNetworkBasicOps-server-1350364126',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1350364126',id=6,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJq75Cwf3fE3Eo6aSvVEw6ZmFrXTJMC7KUQtffEBX2EhKh3zXojN07EirD/YNtNzowas01LwSdkjT048U0kK1Pkd1upNeKr0R9xHgP3GlO+3xbjcu8vRl65sDom+kt9XeQ==',key_name='tempest-TestNetworkBasicOps-368979134',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:24:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-b5az00dp',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:24:45Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=77bc78a9-08a2-448f-b9c0-cfd055940b6b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.780 186993 DEBUG nova.network.os_vif_util [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.780 186993 DEBUG nova.network.os_vif_util [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:65:10,bridge_name='br-int',has_traffic_filtering=True,id=56fa819b-df3d-49ba-a5c9-698cc74fb8aa,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fa819b-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.782 186993 DEBUG nova.virt.libvirt.guest [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] attach device xml: <interface type="ethernet">
Dec 10 10:25:13 compute-0 nova_compute[186989]:   <mac address="fa:16:3e:73:65:10"/>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   <model type="virtio"/>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   <mtu size="1442"/>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   <target dev="tap56fa819b-df"/>
Dec 10 10:25:13 compute-0 nova_compute[186989]: </interface>
Dec 10 10:25:13 compute-0 nova_compute[186989]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Dec 10 10:25:13 compute-0 kernel: tap56fa819b-df: entered promiscuous mode
Dec 10 10:25:13 compute-0 NetworkManager[55541]: <info>  [1765362313.7943] manager: (tap56fa819b-df): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.794 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:13 compute-0 ovn_controller[95452]: 2025-12-10T10:25:13Z|00092|binding|INFO|Claiming lport 56fa819b-df3d-49ba-a5c9-698cc74fb8aa for this chassis.
Dec 10 10:25:13 compute-0 ovn_controller[95452]: 2025-12-10T10:25:13Z|00093|binding|INFO|56fa819b-df3d-49ba-a5c9-698cc74fb8aa: Claiming fa:16:3e:73:65:10 10.100.0.21
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.806 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:65:10 10.100.0.21'], port_security=['fa:16:3e:73:65:10 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': '796e6156-6d8e-4cf4-b04a-830fa4553503', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7d4f5c1-67c9-4f9d-8014-0361c6ae4f32, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=56fa819b-df3d-49ba-a5c9-698cc74fb8aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.807 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 56fa819b-df3d-49ba-a5c9-698cc74fb8aa in datapath 5f4c16d5-f7c5-440e-94e4-418777bf573c bound to our chassis
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.808 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5f4c16d5-f7c5-440e-94e4-418777bf573c
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.822 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[a50eef31-8d6a-4093-ab4c-f13f95bf0953]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.824 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5f4c16d5-f1 in ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 10 10:25:13 compute-0 systemd-udevd[216025]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.826 213247 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5f4c16d5-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.826 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[dc46d477-ea47-4416-8937-0a0bd6ce7553]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.829 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[77630eb2-6a90-412e-afad-1de43af25c03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.831 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:13 compute-0 ovn_controller[95452]: 2025-12-10T10:25:13Z|00094|binding|INFO|Setting lport 56fa819b-df3d-49ba-a5c9-698cc74fb8aa ovn-installed in OVS
Dec 10 10:25:13 compute-0 ovn_controller[95452]: 2025-12-10T10:25:13Z|00095|binding|INFO|Setting lport 56fa819b-df3d-49ba-a5c9-698cc74fb8aa up in Southbound
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.838 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:13 compute-0 NetworkManager[55541]: <info>  [1765362313.8410] device (tap56fa819b-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:25:13 compute-0 NetworkManager[55541]: <info>  [1765362313.8415] device (tap56fa819b-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.845 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[c18731f7-7f88-48f9-a128-f269aef5e787]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.860 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[abec5002-9239-42df-b872-b70040989d35]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.891 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[22605fd1-f5fc-4345-8d48-b24df3f4ecd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.896 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[2be66bb0-f9f1-4968-ba7c-f12879ba2356]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:13 compute-0 NetworkManager[55541]: <info>  [1765362313.8975] manager: (tap5f4c16d5-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Dec 10 10:25:13 compute-0 systemd-udevd[216028]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.903 186993 DEBUG nova.virt.libvirt.driver [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.904 186993 DEBUG nova.virt.libvirt.driver [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.904 186993 DEBUG nova.virt.libvirt.driver [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:89:74:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.904 186993 DEBUG nova.virt.libvirt.driver [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:73:65:10, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.932 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[c278fff5-7874-46b1-9f9e-79741086e72f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.935 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[1020dc8b-621e-4a6f-99d1-e561e58ec32b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.937 186993 DEBUG nova.virt.libvirt.guest [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:25:13 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-1350364126</nova:name>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:25:13</nova:creationTime>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:25:13 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:25:13 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:25:13 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:25:13 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:25:13 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:25:13 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:25:13 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:25:13 compute-0 nova_compute[186989]:     <nova:port uuid="507bf448-94f2-4c23-86a4-a13b31717ff8">
Dec 10 10:25:13 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 10 10:25:13 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:25:13 compute-0 nova_compute[186989]:     <nova:port uuid="56fa819b-df3d-49ba-a5c9-698cc74fb8aa">
Dec 10 10:25:13 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Dec 10 10:25:13 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:25:13 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:25:13 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:25:13 compute-0 nova_compute[186989]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 10 10:25:13 compute-0 nova_compute[186989]: 2025-12-10 10:25:13.959 186993 DEBUG oslo_concurrency.lockutils [None req-50f2cf55-5ed1-4fa0-b918-8374b1eec3aa 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "interface-77bc78a9-08a2-448f-b9c0-cfd055940b6b-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 4.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:13 compute-0 NetworkManager[55541]: <info>  [1765362313.9615] device (tap5f4c16d5-f0): carrier: link connected
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.968 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[5f479f5d-02ea-42bb-bab1-7d5d82862715]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:13 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:13.987 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[2936a1e9-efd6-46d9-a832-59a3e1ae2d6b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5f4c16d5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:a4:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 332917, 'reachable_time': 31725, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216051, 'error': None, 'target': 'ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:14.005 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb4e4a3-3fb6-409f-bb4c-15bf14a11381]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:a4c8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 332917, 'tstamp': 332917}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216052, 'error': None, 'target': 'ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:14.024 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[cd6974e7-8b1f-488e-b5a3-9c93ec1facb1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5f4c16d5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:a4:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 332917, 'reachable_time': 31725, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216053, 'error': None, 'target': 'ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:14.059 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a5178f-96ae-46c7-8e47-3046bd5e4cec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:14.141 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5427c45d-c60b-45c3-9ffc-77c66d027049]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:14.143 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f4c16d5-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:14.144 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:14.144 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5f4c16d5-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:14 compute-0 nova_compute[186989]: 2025-12-10 10:25:14.146 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:14 compute-0 NetworkManager[55541]: <info>  [1765362314.1473] manager: (tap5f4c16d5-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Dec 10 10:25:14 compute-0 kernel: tap5f4c16d5-f0: entered promiscuous mode
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:14.151 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5f4c16d5-f0, col_values=(('external_ids', {'iface-id': '11811771-dcf3-4d12-93ff-39d266ef1136'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:14 compute-0 nova_compute[186989]: 2025-12-10 10:25:14.152 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:14 compute-0 ovn_controller[95452]: 2025-12-10T10:25:14Z|00096|binding|INFO|Releasing lport 11811771-dcf3-4d12-93ff-39d266ef1136 from this chassis (sb_readonly=0)
Dec 10 10:25:14 compute-0 nova_compute[186989]: 2025-12-10 10:25:14.169 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:14.170 104302 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5f4c16d5-f7c5-440e-94e4-418777bf573c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5f4c16d5-f7c5-440e-94e4-418777bf573c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:14.171 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0fb49f-1c78-406d-8187-ac4f9a71c47a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:14.172 104302 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: global
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     log         /dev/log local0 debug
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     log-tag     haproxy-metadata-proxy-5f4c16d5-f7c5-440e-94e4-418777bf573c
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     user        root
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     group       root
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     maxconn     1024
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     pidfile     /var/lib/neutron/external/pids/5f4c16d5-f7c5-440e-94e4-418777bf573c.pid.haproxy
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     daemon
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: defaults
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     log global
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     mode http
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     option httplog
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     option dontlognull
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     option http-server-close
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     option forwardfor
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     retries                 3
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     timeout http-request    30s
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     timeout connect         30s
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     timeout client          32s
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     timeout server          32s
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     timeout http-keep-alive 30s
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: listen listener
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     bind 169.254.169.254:80
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     server metadata /var/lib/neutron/metadata_proxy
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:     http-request add-header X-OVN-Network-ID 5f4c16d5-f7c5-440e-94e4-418777bf573c
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 10 10:25:14 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:14.173 104302 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'env', 'PROCESS_TAG=haproxy-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5f4c16d5-f7c5-440e-94e4-418777bf573c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 10 10:25:14 compute-0 nova_compute[186989]: 2025-12-10 10:25:14.297 186993 DEBUG nova.compute.manager [req-cde96113-fdbf-4d87-b221-780c36a2285b req-f79a112a-584b-462a-bed1-c5fdbc15b7a5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-vif-plugged-56fa819b-df3d-49ba-a5c9-698cc74fb8aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:25:14 compute-0 nova_compute[186989]: 2025-12-10 10:25:14.298 186993 DEBUG oslo_concurrency.lockutils [req-cde96113-fdbf-4d87-b221-780c36a2285b req-f79a112a-584b-462a-bed1-c5fdbc15b7a5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:14 compute-0 nova_compute[186989]: 2025-12-10 10:25:14.299 186993 DEBUG oslo_concurrency.lockutils [req-cde96113-fdbf-4d87-b221-780c36a2285b req-f79a112a-584b-462a-bed1-c5fdbc15b7a5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:14 compute-0 nova_compute[186989]: 2025-12-10 10:25:14.299 186993 DEBUG oslo_concurrency.lockutils [req-cde96113-fdbf-4d87-b221-780c36a2285b req-f79a112a-584b-462a-bed1-c5fdbc15b7a5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:14 compute-0 nova_compute[186989]: 2025-12-10 10:25:14.299 186993 DEBUG nova.compute.manager [req-cde96113-fdbf-4d87-b221-780c36a2285b req-f79a112a-584b-462a-bed1-c5fdbc15b7a5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] No waiting events found dispatching network-vif-plugged-56fa819b-df3d-49ba-a5c9-698cc74fb8aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:25:14 compute-0 nova_compute[186989]: 2025-12-10 10:25:14.300 186993 WARNING nova.compute.manager [req-cde96113-fdbf-4d87-b221-780c36a2285b req-f79a112a-584b-462a-bed1-c5fdbc15b7a5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received unexpected event network-vif-plugged-56fa819b-df3d-49ba-a5c9-698cc74fb8aa for instance with vm_state active and task_state None.
Dec 10 10:25:14 compute-0 podman[216085]: 2025-12-10 10:25:14.586411682 +0000 UTC m=+0.028115616 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:25:14 compute-0 podman[216085]: 2025-12-10 10:25:14.686154811 +0000 UTC m=+0.127858715 container create 327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 10 10:25:14 compute-0 systemd[1]: Started libpod-conmon-327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26.scope.
Dec 10 10:25:14 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:25:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52544b67c9d2ec5baf671eaba519b386032c9d27ce926ff94d498498cd40d92e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:25:14 compute-0 podman[216085]: 2025-12-10 10:25:14.770819954 +0000 UTC m=+0.212523868 container init 327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:25:14 compute-0 podman[216085]: 2025-12-10 10:25:14.776900708 +0000 UTC m=+0.218604602 container start 327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:25:14 compute-0 neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c[216100]: [NOTICE]   (216104) : New worker (216106) forked
Dec 10 10:25:14 compute-0 neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c[216100]: [NOTICE]   (216104) : Loading success.
Dec 10 10:25:14 compute-0 nova_compute[186989]: 2025-12-10 10:25:14.831 186993 DEBUG nova.network.neutron [req-1dba53e7-1f0e-43f5-b896-73de566bd82c req-592b121d-16d8-4551-ac57-4657e17f7808 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updated VIF entry in instance network info cache for port 56fa819b-df3d-49ba-a5c9-698cc74fb8aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:25:14 compute-0 nova_compute[186989]: 2025-12-10 10:25:14.832 186993 DEBUG nova.network.neutron [req-1dba53e7-1f0e-43f5-b896-73de566bd82c req-592b121d-16d8-4551-ac57-4657e17f7808 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updating instance_info_cache with network_info: [{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:25:14 compute-0 nova_compute[186989]: 2025-12-10 10:25:14.849 186993 DEBUG oslo_concurrency.lockutils [req-1dba53e7-1f0e-43f5-b896-73de566bd82c req-592b121d-16d8-4551-ac57-4657e17f7808 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:25:16 compute-0 nova_compute[186989]: 2025-12-10 10:25:16.396 186993 DEBUG nova.compute.manager [req-241f9533-c6fa-4c8e-8aea-c6b925b3d0b1 req-93924842-4d7f-4770-8e4f-abc02af6f4d7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-vif-plugged-56fa819b-df3d-49ba-a5c9-698cc74fb8aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:25:16 compute-0 nova_compute[186989]: 2025-12-10 10:25:16.396 186993 DEBUG oslo_concurrency.lockutils [req-241f9533-c6fa-4c8e-8aea-c6b925b3d0b1 req-93924842-4d7f-4770-8e4f-abc02af6f4d7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:16 compute-0 nova_compute[186989]: 2025-12-10 10:25:16.397 186993 DEBUG oslo_concurrency.lockutils [req-241f9533-c6fa-4c8e-8aea-c6b925b3d0b1 req-93924842-4d7f-4770-8e4f-abc02af6f4d7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:16 compute-0 nova_compute[186989]: 2025-12-10 10:25:16.397 186993 DEBUG oslo_concurrency.lockutils [req-241f9533-c6fa-4c8e-8aea-c6b925b3d0b1 req-93924842-4d7f-4770-8e4f-abc02af6f4d7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:16 compute-0 nova_compute[186989]: 2025-12-10 10:25:16.397 186993 DEBUG nova.compute.manager [req-241f9533-c6fa-4c8e-8aea-c6b925b3d0b1 req-93924842-4d7f-4770-8e4f-abc02af6f4d7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] No waiting events found dispatching network-vif-plugged-56fa819b-df3d-49ba-a5c9-698cc74fb8aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:25:16 compute-0 nova_compute[186989]: 2025-12-10 10:25:16.397 186993 WARNING nova.compute.manager [req-241f9533-c6fa-4c8e-8aea-c6b925b3d0b1 req-93924842-4d7f-4770-8e4f-abc02af6f4d7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received unexpected event network-vif-plugged-56fa819b-df3d-49ba-a5c9-698cc74fb8aa for instance with vm_state active and task_state None.
Dec 10 10:25:16 compute-0 ovn_controller[95452]: 2025-12-10T10:25:16Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:73:65:10 10.100.0.21
Dec 10 10:25:16 compute-0 ovn_controller[95452]: 2025-12-10T10:25:16Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:73:65:10 10.100.0.21
Dec 10 10:25:17 compute-0 nova_compute[186989]: 2025-12-10 10:25:17.524 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:18 compute-0 nova_compute[186989]: 2025-12-10 10:25:18.773 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:20 compute-0 podman[216115]: 2025-12-10 10:25:20.018137439 +0000 UTC m=+0.056614111 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 10 10:25:22 compute-0 podman[216139]: 2025-12-10 10:25:22.021249239 +0000 UTC m=+0.063493557 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:25:22 compute-0 nova_compute[186989]: 2025-12-10 10:25:22.572 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:23 compute-0 nova_compute[186989]: 2025-12-10 10:25:23.777 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:27 compute-0 nova_compute[186989]: 2025-12-10 10:25:27.596 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:28 compute-0 podman[216159]: 2025-12-10 10:25:28.023843649 +0000 UTC m=+0.060706161 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 10 10:25:28 compute-0 podman[216158]: 2025-12-10 10:25:28.055900331 +0000 UTC m=+0.097978963 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 10 10:25:28 compute-0 podman[216160]: 2025-12-10 10:25:28.065772536 +0000 UTC m=+0.102627937 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 10 10:25:28 compute-0 nova_compute[186989]: 2025-12-10 10:25:28.783 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.237 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "f2349666-5326-4e13-bd6a-8d6adb3613ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.237 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.253 186993 DEBUG nova.compute.manager [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.334 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.334 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.343 186993 DEBUG nova.virt.hardware [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.343 186993 INFO nova.compute.claims [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.531 186993 DEBUG nova.compute.provider_tree [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.549 186993 DEBUG nova.scheduler.client.report [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.573 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.574 186993 DEBUG nova.compute.manager [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.632 186993 DEBUG nova.compute.manager [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.632 186993 DEBUG nova.network.neutron [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.655 186993 INFO nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.675 186993 DEBUG nova.compute.manager [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.776 186993 DEBUG nova.compute.manager [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.777 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.778 186993 INFO nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Creating image(s)
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.778 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.779 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.780 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.798 186993 DEBUG oslo_concurrency.processutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.856 186993 DEBUG oslo_concurrency.processutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.858 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.858 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.868 186993 DEBUG oslo_concurrency.processutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.927 186993 DEBUG oslo_concurrency.processutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.928 186993 DEBUG oslo_concurrency.processutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.979 186993 DEBUG oslo_concurrency.processutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.980 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:30 compute-0 nova_compute[186989]: 2025-12-10 10:25:30.981 186993 DEBUG oslo_concurrency.processutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:25:31 compute-0 nova_compute[186989]: 2025-12-10 10:25:31.057 186993 DEBUG oslo_concurrency.processutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:25:31 compute-0 nova_compute[186989]: 2025-12-10 10:25:31.058 186993 DEBUG nova.virt.disk.api [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:25:31 compute-0 nova_compute[186989]: 2025-12-10 10:25:31.059 186993 DEBUG oslo_concurrency.processutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:25:31 compute-0 nova_compute[186989]: 2025-12-10 10:25:31.123 186993 DEBUG oslo_concurrency.processutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:25:31 compute-0 nova_compute[186989]: 2025-12-10 10:25:31.124 186993 DEBUG nova.virt.disk.api [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:25:31 compute-0 nova_compute[186989]: 2025-12-10 10:25:31.125 186993 DEBUG nova.objects.instance [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid f2349666-5326-4e13-bd6a-8d6adb3613ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:25:31 compute-0 nova_compute[186989]: 2025-12-10 10:25:31.141 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:25:31 compute-0 nova_compute[186989]: 2025-12-10 10:25:31.142 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Ensure instance console log exists: /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:25:31 compute-0 nova_compute[186989]: 2025-12-10 10:25:31.143 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:31 compute-0 nova_compute[186989]: 2025-12-10 10:25:31.143 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:31 compute-0 nova_compute[186989]: 2025-12-10 10:25:31.144 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:31.467 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:31.468 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:31.469 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:31 compute-0 nova_compute[186989]: 2025-12-10 10:25:31.651 186993 DEBUG nova.policy [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:25:32 compute-0 nova_compute[186989]: 2025-12-10 10:25:32.600 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:33 compute-0 nova_compute[186989]: 2025-12-10 10:25:33.160 186993 DEBUG nova.network.neutron [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Successfully created port: 4f6a2c06-ec46-4119-90a8-7e67227137b7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:25:33 compute-0 nova_compute[186989]: 2025-12-10 10:25:33.787 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:34 compute-0 nova_compute[186989]: 2025-12-10 10:25:34.543 186993 DEBUG nova.network.neutron [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Successfully updated port: 4f6a2c06-ec46-4119-90a8-7e67227137b7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:25:34 compute-0 nova_compute[186989]: 2025-12-10 10:25:34.556 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-f2349666-5326-4e13-bd6a-8d6adb3613ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:25:34 compute-0 nova_compute[186989]: 2025-12-10 10:25:34.557 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-f2349666-5326-4e13-bd6a-8d6adb3613ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:25:34 compute-0 nova_compute[186989]: 2025-12-10 10:25:34.557 186993 DEBUG nova.network.neutron [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:25:34 compute-0 nova_compute[186989]: 2025-12-10 10:25:34.657 186993 DEBUG nova.compute.manager [req-bb690798-0573-443a-90f3-5b9d13273d9a req-fea84dc9-b72a-42c0-b502-e736d4350b42 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Received event network-changed-4f6a2c06-ec46-4119-90a8-7e67227137b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:25:34 compute-0 nova_compute[186989]: 2025-12-10 10:25:34.659 186993 DEBUG nova.compute.manager [req-bb690798-0573-443a-90f3-5b9d13273d9a req-fea84dc9-b72a-42c0-b502-e736d4350b42 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Refreshing instance network info cache due to event network-changed-4f6a2c06-ec46-4119-90a8-7e67227137b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:25:34 compute-0 nova_compute[186989]: 2025-12-10 10:25:34.659 186993 DEBUG oslo_concurrency.lockutils [req-bb690798-0573-443a-90f3-5b9d13273d9a req-fea84dc9-b72a-42c0-b502-e736d4350b42 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-f2349666-5326-4e13-bd6a-8d6adb3613ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:25:34 compute-0 nova_compute[186989]: 2025-12-10 10:25:34.741 186993 DEBUG nova.network.neutron [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.884 186993 DEBUG nova.network.neutron [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Updating instance_info_cache with network_info: [{"id": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "address": "fa:16:3e:28:70:4e", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f6a2c06-ec", "ovs_interfaceid": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.910 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-f2349666-5326-4e13-bd6a-8d6adb3613ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.910 186993 DEBUG nova.compute.manager [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Instance network_info: |[{"id": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "address": "fa:16:3e:28:70:4e", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f6a2c06-ec", "ovs_interfaceid": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.911 186993 DEBUG oslo_concurrency.lockutils [req-bb690798-0573-443a-90f3-5b9d13273d9a req-fea84dc9-b72a-42c0-b502-e736d4350b42 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-f2349666-5326-4e13-bd6a-8d6adb3613ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.912 186993 DEBUG nova.network.neutron [req-bb690798-0573-443a-90f3-5b9d13273d9a req-fea84dc9-b72a-42c0-b502-e736d4350b42 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Refreshing network info cache for port 4f6a2c06-ec46-4119-90a8-7e67227137b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.915 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Start _get_guest_xml network_info=[{"id": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "address": "fa:16:3e:28:70:4e", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f6a2c06-ec", "ovs_interfaceid": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.921 186993 WARNING nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.927 186993 DEBUG nova.virt.libvirt.host [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.928 186993 DEBUG nova.virt.libvirt.host [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.938 186993 DEBUG nova.virt.libvirt.host [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.939 186993 DEBUG nova.virt.libvirt.host [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.939 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.940 186993 DEBUG nova.virt.hardware [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.941 186993 DEBUG nova.virt.hardware [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.941 186993 DEBUG nova.virt.hardware [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.941 186993 DEBUG nova.virt.hardware [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.942 186993 DEBUG nova.virt.hardware [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.942 186993 DEBUG nova.virt.hardware [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.942 186993 DEBUG nova.virt.hardware [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.943 186993 DEBUG nova.virt.hardware [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.943 186993 DEBUG nova.virt.hardware [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.943 186993 DEBUG nova.virt.hardware [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.944 186993 DEBUG nova.virt.hardware [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.950 186993 DEBUG nova.virt.libvirt.vif [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:25:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1709359727',display_name='tempest-TestNetworkBasicOps-server-1709359727',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1709359727',id=7,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBLRFf5DyWtKsMGcD7XtVySQIkHWqQ76xlQRhLQIAjovTuJ3R2CqeAYlxplAlLStHgCRnpSjWsr5HOYIdaMnb6IqUt5HYNmyfJaGaXlFZWwJAES3GhTJmJu4W+lxCogz9w==',key_name='tempest-TestNetworkBasicOps-33435830',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-9o4z4nxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:25:30Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=f2349666-5326-4e13-bd6a-8d6adb3613ad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "address": "fa:16:3e:28:70:4e", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f6a2c06-ec", "ovs_interfaceid": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.950 186993 DEBUG nova.network.os_vif_util [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "address": "fa:16:3e:28:70:4e", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f6a2c06-ec", "ovs_interfaceid": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.951 186993 DEBUG nova.network.os_vif_util [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:70:4e,bridge_name='br-int',has_traffic_filtering=True,id=4f6a2c06-ec46-4119-90a8-7e67227137b7,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f6a2c06-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.952 186993 DEBUG nova.objects.instance [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid f2349666-5326-4e13-bd6a-8d6adb3613ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.972 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:25:35 compute-0 nova_compute[186989]:   <uuid>f2349666-5326-4e13-bd6a-8d6adb3613ad</uuid>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   <name>instance-00000007</name>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-1709359727</nova:name>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:25:35</nova:creationTime>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:25:35 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:25:35 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:25:35 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:25:35 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:25:35 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:25:35 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:25:35 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:25:35 compute-0 nova_compute[186989]:         <nova:port uuid="4f6a2c06-ec46-4119-90a8-7e67227137b7">
Dec 10 10:25:35 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <system>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <entry name="serial">f2349666-5326-4e13-bd6a-8d6adb3613ad</entry>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <entry name="uuid">f2349666-5326-4e13-bd6a-8d6adb3613ad</entry>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     </system>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   <os>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   </os>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   <features>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   </features>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk.config"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:28:70:4e"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <target dev="tap4f6a2c06-ec"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/console.log" append="off"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <video>
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     </video>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:25:35 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:25:35 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:25:35 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:25:35 compute-0 nova_compute[186989]: </domain>
Dec 10 10:25:35 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.974 186993 DEBUG nova.compute.manager [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Preparing to wait for external event network-vif-plugged-4f6a2c06-ec46-4119-90a8-7e67227137b7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.975 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.975 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.976 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.977 186993 DEBUG nova.virt.libvirt.vif [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:25:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1709359727',display_name='tempest-TestNetworkBasicOps-server-1709359727',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1709359727',id=7,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBLRFf5DyWtKsMGcD7XtVySQIkHWqQ76xlQRhLQIAjovTuJ3R2CqeAYlxplAlLStHgCRnpSjWsr5HOYIdaMnb6IqUt5HYNmyfJaGaXlFZWwJAES3GhTJmJu4W+lxCogz9w==',key_name='tempest-TestNetworkBasicOps-33435830',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-9o4z4nxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:25:30Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=f2349666-5326-4e13-bd6a-8d6adb3613ad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "address": "fa:16:3e:28:70:4e", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f6a2c06-ec", "ovs_interfaceid": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.977 186993 DEBUG nova.network.os_vif_util [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "address": "fa:16:3e:28:70:4e", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f6a2c06-ec", "ovs_interfaceid": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.978 186993 DEBUG nova.network.os_vif_util [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:70:4e,bridge_name='br-int',has_traffic_filtering=True,id=4f6a2c06-ec46-4119-90a8-7e67227137b7,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f6a2c06-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.978 186993 DEBUG os_vif [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:70:4e,bridge_name='br-int',has_traffic_filtering=True,id=4f6a2c06-ec46-4119-90a8-7e67227137b7,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f6a2c06-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.979 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.980 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.980 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.985 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.985 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f6a2c06-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.986 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4f6a2c06-ec, col_values=(('external_ids', {'iface-id': '4f6a2c06-ec46-4119-90a8-7e67227137b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:28:70:4e', 'vm-uuid': 'f2349666-5326-4e13-bd6a-8d6adb3613ad'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:35 compute-0 NetworkManager[55541]: <info>  [1765362335.9893] manager: (tap4f6a2c06-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.990 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.997 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:35 compute-0 nova_compute[186989]: 2025-12-10 10:25:35.998 186993 INFO os_vif [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:70:4e,bridge_name='br-int',has_traffic_filtering=True,id=4f6a2c06-ec46-4119-90a8-7e67227137b7,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f6a2c06-ec')
Dec 10 10:25:36 compute-0 nova_compute[186989]: 2025-12-10 10:25:36.365 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:25:36 compute-0 nova_compute[186989]: 2025-12-10 10:25:36.366 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:25:36 compute-0 nova_compute[186989]: 2025-12-10 10:25:36.366 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:28:70:4e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:25:36 compute-0 nova_compute[186989]: 2025-12-10 10:25:36.367 186993 INFO nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Using config drive
Dec 10 10:25:36 compute-0 nova_compute[186989]: 2025-12-10 10:25:36.909 186993 INFO nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Creating config drive at /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk.config
Dec 10 10:25:36 compute-0 nova_compute[186989]: 2025-12-10 10:25:36.914 186993 DEBUG oslo_concurrency.processutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptgpg5tkt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.041 186993 DEBUG oslo_concurrency.processutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptgpg5tkt" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:25:37 compute-0 podman[216242]: 2025-12-10 10:25:37.052436284 +0000 UTC m=+0.083827783 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 10 10:25:37 compute-0 kernel: tap4f6a2c06-ec: entered promiscuous mode
Dec 10 10:25:37 compute-0 NetworkManager[55541]: <info>  [1765362337.1125] manager: (tap4f6a2c06-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.115 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:37 compute-0 ovn_controller[95452]: 2025-12-10T10:25:37Z|00097|binding|INFO|Claiming lport 4f6a2c06-ec46-4119-90a8-7e67227137b7 for this chassis.
Dec 10 10:25:37 compute-0 ovn_controller[95452]: 2025-12-10T10:25:37Z|00098|binding|INFO|4f6a2c06-ec46-4119-90a8-7e67227137b7: Claiming fa:16:3e:28:70:4e 10.100.0.18
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.125 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:70:4e 10.100.0.18'], port_security=['fa:16:3e:28:70:4e 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'f2349666-5326-4e13-bd6a-8d6adb3613ad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': '125cfa65-5fc5-44d8-8154-7d126b287359', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7d4f5c1-67c9-4f9d-8014-0361c6ae4f32, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=4f6a2c06-ec46-4119-90a8-7e67227137b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.126 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 4f6a2c06-ec46-4119-90a8-7e67227137b7 in datapath 5f4c16d5-f7c5-440e-94e4-418777bf573c bound to our chassis
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.127 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5f4c16d5-f7c5-440e-94e4-418777bf573c
Dec 10 10:25:37 compute-0 ovn_controller[95452]: 2025-12-10T10:25:37Z|00099|binding|INFO|Setting lport 4f6a2c06-ec46-4119-90a8-7e67227137b7 ovn-installed in OVS
Dec 10 10:25:37 compute-0 ovn_controller[95452]: 2025-12-10T10:25:37Z|00100|binding|INFO|Setting lport 4f6a2c06-ec46-4119-90a8-7e67227137b7 up in Southbound
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.129 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.131 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.147 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[6f0ac656-727a-4d15-971c-82ad87c149f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:37 compute-0 systemd-udevd[216282]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:25:37 compute-0 systemd-machined[153379]: New machine qemu-7-instance-00000007.
Dec 10 10:25:37 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Dec 10 10:25:37 compute-0 NetworkManager[55541]: <info>  [1765362337.1747] device (tap4f6a2c06-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:25:37 compute-0 NetworkManager[55541]: <info>  [1765362337.1757] device (tap4f6a2c06-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.190 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[31606d0f-a4be-4764-a390-936f6d866d63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.194 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[33d69553-4582-4dfb-9406-7e40e661075c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.227 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[9237b65d-c8b9-4028-9369-f2569300711f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.246 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[06ea8738-784f-4bf4-ba27-f979cdb8ea2f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5f4c16d5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:a4:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 332917, 'reachable_time': 31725, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216294, 'error': None, 'target': 'ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.265 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[861047ab-1952-4329-b3f6-60bd83b3624a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5f4c16d5-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 332930, 'tstamp': 332930}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216296, 'error': None, 'target': 'ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap5f4c16d5-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 332935, 'tstamp': 332935}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216296, 'error': None, 'target': 'ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.267 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f4c16d5-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.269 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.270 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.271 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5f4c16d5-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.271 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.271 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5f4c16d5-f0, col_values=(('external_ids', {'iface-id': '11811771-dcf3-4d12-93ff-39d266ef1136'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.272 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.585 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362337.5833535, f2349666-5326-4e13-bd6a-8d6adb3613ad => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.585 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] VM Started (Lifecycle Event)
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.601 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.607 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.613 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362337.5885103, f2349666-5326-4e13-bd6a-8d6adb3613ad => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.614 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] VM Paused (Lifecycle Event)
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.632 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.636 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.646 186993 DEBUG nova.compute.manager [req-047ba360-0a48-421d-a06b-a92d4ded5aa1 req-38604ee9-cd08-481a-a14d-8eccf640d05c 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Received event network-vif-plugged-4f6a2c06-ec46-4119-90a8-7e67227137b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.646 186993 DEBUG oslo_concurrency.lockutils [req-047ba360-0a48-421d-a06b-a92d4ded5aa1 req-38604ee9-cd08-481a-a14d-8eccf640d05c 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.647 186993 DEBUG oslo_concurrency.lockutils [req-047ba360-0a48-421d-a06b-a92d4ded5aa1 req-38604ee9-cd08-481a-a14d-8eccf640d05c 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.647 186993 DEBUG oslo_concurrency.lockutils [req-047ba360-0a48-421d-a06b-a92d4ded5aa1 req-38604ee9-cd08-481a-a14d-8eccf640d05c 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.647 186993 DEBUG nova.compute.manager [req-047ba360-0a48-421d-a06b-a92d4ded5aa1 req-38604ee9-cd08-481a-a14d-8eccf640d05c 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Processing event network-vif-plugged-4f6a2c06-ec46-4119-90a8-7e67227137b7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.648 186993 DEBUG nova.compute.manager [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.653 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.655 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.655 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362337.6523721, f2349666-5326-4e13-bd6a-8d6adb3613ad => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.656 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] VM Resumed (Lifecycle Event)
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.658 186993 INFO nova.virt.libvirt.driver [-] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Instance spawned successfully.
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.659 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.680 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.686 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.690 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.690 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.691 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.691 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.692 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.692 186993 DEBUG nova.virt.libvirt.driver [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.716 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.753 186993 INFO nova.compute.manager [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Took 6.98 seconds to spawn the instance on the hypervisor.
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.753 186993 DEBUG nova.compute.manager [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.779 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:d5:69', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:b1:dd:ed:fa:0b'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.780 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:37 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:37.780 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.823 186993 INFO nova.compute.manager [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Took 7.52 seconds to build instance.
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.838 186993 DEBUG oslo_concurrency.lockutils [None req-6b69a63d-5d6a-46cc-9076-189b792f8824 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.962 186993 DEBUG nova.network.neutron [req-bb690798-0573-443a-90f3-5b9d13273d9a req-fea84dc9-b72a-42c0-b502-e736d4350b42 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Updated VIF entry in instance network info cache for port 4f6a2c06-ec46-4119-90a8-7e67227137b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.963 186993 DEBUG nova.network.neutron [req-bb690798-0573-443a-90f3-5b9d13273d9a req-fea84dc9-b72a-42c0-b502-e736d4350b42 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Updating instance_info_cache with network_info: [{"id": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "address": "fa:16:3e:28:70:4e", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f6a2c06-ec", "ovs_interfaceid": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:25:37 compute-0 nova_compute[186989]: 2025-12-10 10:25:37.985 186993 DEBUG oslo_concurrency.lockutils [req-bb690798-0573-443a-90f3-5b9d13273d9a req-fea84dc9-b72a-42c0-b502-e736d4350b42 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-f2349666-5326-4e13-bd6a-8d6adb3613ad" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:25:39 compute-0 nova_compute[186989]: 2025-12-10 10:25:39.758 186993 DEBUG nova.compute.manager [req-859ce9d9-1eb5-48e5-b2d6-90b4ab89e535 req-452dd456-7690-4f55-bf4b-4d0521b960b5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Received event network-vif-plugged-4f6a2c06-ec46-4119-90a8-7e67227137b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:25:39 compute-0 nova_compute[186989]: 2025-12-10 10:25:39.758 186993 DEBUG oslo_concurrency.lockutils [req-859ce9d9-1eb5-48e5-b2d6-90b4ab89e535 req-452dd456-7690-4f55-bf4b-4d0521b960b5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:39 compute-0 nova_compute[186989]: 2025-12-10 10:25:39.759 186993 DEBUG oslo_concurrency.lockutils [req-859ce9d9-1eb5-48e5-b2d6-90b4ab89e535 req-452dd456-7690-4f55-bf4b-4d0521b960b5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:39 compute-0 nova_compute[186989]: 2025-12-10 10:25:39.759 186993 DEBUG oslo_concurrency.lockutils [req-859ce9d9-1eb5-48e5-b2d6-90b4ab89e535 req-452dd456-7690-4f55-bf4b-4d0521b960b5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:39 compute-0 nova_compute[186989]: 2025-12-10 10:25:39.759 186993 DEBUG nova.compute.manager [req-859ce9d9-1eb5-48e5-b2d6-90b4ab89e535 req-452dd456-7690-4f55-bf4b-4d0521b960b5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] No waiting events found dispatching network-vif-plugged-4f6a2c06-ec46-4119-90a8-7e67227137b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:25:39 compute-0 nova_compute[186989]: 2025-12-10 10:25:39.759 186993 WARNING nova.compute.manager [req-859ce9d9-1eb5-48e5-b2d6-90b4ab89e535 req-452dd456-7690-4f55-bf4b-4d0521b960b5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Received unexpected event network-vif-plugged-4f6a2c06-ec46-4119-90a8-7e67227137b7 for instance with vm_state active and task_state None.
Dec 10 10:25:40 compute-0 podman[216304]: 2025-12-10 10:25:40.072675002 +0000 UTC m=+0.112082881 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:25:41 compute-0 nova_compute[186989]: 2025-12-10 10:25:41.024 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:42 compute-0 nova_compute[186989]: 2025-12-10 10:25:42.608 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:46 compute-0 nova_compute[186989]: 2025-12-10 10:25:46.028 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:46 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:25:46.782 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:25:46 compute-0 nova_compute[186989]: 2025-12-10 10:25:46.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:47 compute-0 nova_compute[186989]: 2025-12-10 10:25:47.612 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:48 compute-0 nova_compute[186989]: 2025-12-10 10:25:48.916 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:49 compute-0 ovn_controller[95452]: 2025-12-10T10:25:49Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:28:70:4e 10.100.0.18
Dec 10 10:25:49 compute-0 ovn_controller[95452]: 2025-12-10T10:25:49Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:28:70:4e 10.100.0.18
Dec 10 10:25:50 compute-0 nova_compute[186989]: 2025-12-10 10:25:50.475 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:50 compute-0 nova_compute[186989]: 2025-12-10 10:25:50.507 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Triggering sync for uuid 77bc78a9-08a2-448f-b9c0-cfd055940b6b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 10 10:25:50 compute-0 nova_compute[186989]: 2025-12-10 10:25:50.508 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Triggering sync for uuid f2349666-5326-4e13-bd6a-8d6adb3613ad _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 10 10:25:50 compute-0 nova_compute[186989]: 2025-12-10 10:25:50.508 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:50 compute-0 nova_compute[186989]: 2025-12-10 10:25:50.509 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:50 compute-0 nova_compute[186989]: 2025-12-10 10:25:50.510 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "f2349666-5326-4e13-bd6a-8d6adb3613ad" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:50 compute-0 nova_compute[186989]: 2025-12-10 10:25:50.510 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:50 compute-0 nova_compute[186989]: 2025-12-10 10:25:50.566 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.057s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:50 compute-0 nova_compute[186989]: 2025-12-10 10:25:50.568 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:50 compute-0 nova_compute[186989]: 2025-12-10 10:25:50.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:50 compute-0 nova_compute[186989]: 2025-12-10 10:25:50.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:25:51 compute-0 podman[216355]: 2025-12-10 10:25:51.023367223 +0000 UTC m=+0.064820779 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 10 10:25:51 compute-0 nova_compute[186989]: 2025-12-10 10:25:51.031 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:51 compute-0 nova_compute[186989]: 2025-12-10 10:25:51.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:51 compute-0 nova_compute[186989]: 2025-12-10 10:25:51.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:52 compute-0 nova_compute[186989]: 2025-12-10 10:25:52.614 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:52 compute-0 nova_compute[186989]: 2025-12-10 10:25:52.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:52 compute-0 nova_compute[186989]: 2025-12-10 10:25:52.920 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:25:52 compute-0 nova_compute[186989]: 2025-12-10 10:25:52.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:25:53 compute-0 podman[216381]: 2025-12-10 10:25:53.021066203 +0000 UTC m=+0.059883725 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 10 10:25:53 compute-0 nova_compute[186989]: 2025-12-10 10:25:53.450 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:25:53 compute-0 nova_compute[186989]: 2025-12-10 10:25:53.451 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquired lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:25:53 compute-0 nova_compute[186989]: 2025-12-10 10:25:53.451 186993 DEBUG nova.network.neutron [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 10 10:25:53 compute-0 nova_compute[186989]: 2025-12-10 10:25:53.451 186993 DEBUG nova.objects.instance [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 77bc78a9-08a2-448f-b9c0-cfd055940b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.590 186993 DEBUG nova.network.neutron [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updating instance_info_cache with network_info: [{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.615 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Releasing lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.615 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.615 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.616 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.616 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.642 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.643 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.643 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.643 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.726 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.798 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.799 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.894 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.901 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.963 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:25:55 compute-0 nova_compute[186989]: 2025-12-10 10:25:55.964 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.062 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.109 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.307 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.309 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5404MB free_disk=73.27145385742188GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.309 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.310 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.467 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Instance 77bc78a9-08a2-448f-b9c0-cfd055940b6b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.468 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Instance f2349666-5326-4e13-bd6a-8d6adb3613ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.468 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.468 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.541 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Refreshing inventories for resource provider 94de3f96-a911-486c-b08b-8a5da489baa6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.658 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Updating ProviderTree inventory for provider 94de3f96-a911-486c-b08b-8a5da489baa6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.659 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Updating inventory in ProviderTree for provider 94de3f96-a911-486c-b08b-8a5da489baa6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.684 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Refreshing aggregate associations for resource provider 94de3f96-a911-486c-b08b-8a5da489baa6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.702 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Refreshing trait associations for resource provider 94de3f96-a911-486c-b08b-8a5da489baa6, traits: HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_MMX,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_CLMUL,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.755 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.777 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.810 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.811 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.501s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.813 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.813 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.832 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.833 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.856 186993 DEBUG nova.compute.manager [req-3e0b0b9c-f3fe-4449-a2f1-378a8239a082 req-6ad002f0-d4e3-467a-a548-8a6747cbfdb9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-changed-56fa819b-df3d-49ba-a5c9-698cc74fb8aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.856 186993 DEBUG nova.compute.manager [req-3e0b0b9c-f3fe-4449-a2f1-378a8239a082 req-6ad002f0-d4e3-467a-a548-8a6747cbfdb9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Refreshing instance network info cache due to event network-changed-56fa819b-df3d-49ba-a5c9-698cc74fb8aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.856 186993 DEBUG oslo_concurrency.lockutils [req-3e0b0b9c-f3fe-4449-a2f1-378a8239a082 req-6ad002f0-d4e3-467a-a548-8a6747cbfdb9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.856 186993 DEBUG oslo_concurrency.lockutils [req-3e0b0b9c-f3fe-4449-a2f1-378a8239a082 req-6ad002f0-d4e3-467a-a548-8a6747cbfdb9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.857 186993 DEBUG nova.network.neutron [req-3e0b0b9c-f3fe-4449-a2f1-378a8239a082 req-6ad002f0-d4e3-467a-a548-8a6747cbfdb9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Refreshing network info cache for port 56fa819b-df3d-49ba-a5c9-698cc74fb8aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.930 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.930 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:25:56 compute-0 nova_compute[186989]: 2025-12-10 10:25:56.931 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 10 10:25:57 compute-0 nova_compute[186989]: 2025-12-10 10:25:57.616 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:25:57 compute-0 nova_compute[186989]: 2025-12-10 10:25:57.877 186993 DEBUG nova.network.neutron [req-3e0b0b9c-f3fe-4449-a2f1-378a8239a082 req-6ad002f0-d4e3-467a-a548-8a6747cbfdb9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updated VIF entry in instance network info cache for port 56fa819b-df3d-49ba-a5c9-698cc74fb8aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:25:57 compute-0 nova_compute[186989]: 2025-12-10 10:25:57.877 186993 DEBUG nova.network.neutron [req-3e0b0b9c-f3fe-4449-a2f1-378a8239a082 req-6ad002f0-d4e3-467a-a548-8a6747cbfdb9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updating instance_info_cache with network_info: [{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:25:57 compute-0 nova_compute[186989]: 2025-12-10 10:25:57.901 186993 DEBUG oslo_concurrency.lockutils [req-3e0b0b9c-f3fe-4449-a2f1-378a8239a082 req-6ad002f0-d4e3-467a-a548-8a6747cbfdb9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:25:59 compute-0 podman[216413]: 2025-12-10 10:25:59.059791861 +0000 UTC m=+0.085091473 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:25:59 compute-0 podman[216414]: 2025-12-10 10:25:59.06856689 +0000 UTC m=+0.095146037 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 10 10:25:59 compute-0 podman[216415]: 2025-12-10 10:25:59.106885086 +0000 UTC m=+0.119579574 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Dec 10 10:26:01 compute-0 nova_compute[186989]: 2025-12-10 10:26:01.113 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:02 compute-0 nova_compute[186989]: 2025-12-10 10:26:02.619 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.479 186993 DEBUG oslo_concurrency.lockutils [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "f2349666-5326-4e13-bd6a-8d6adb3613ad" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.480 186993 DEBUG oslo_concurrency.lockutils [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.480 186993 DEBUG oslo_concurrency.lockutils [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.481 186993 DEBUG oslo_concurrency.lockutils [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.481 186993 DEBUG oslo_concurrency.lockutils [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.484 186993 INFO nova.compute.manager [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Terminating instance
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.485 186993 DEBUG nova.compute.manager [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:26:03 compute-0 kernel: tap4f6a2c06-ec (unregistering): left promiscuous mode
Dec 10 10:26:03 compute-0 NetworkManager[55541]: <info>  [1765362363.5128] device (tap4f6a2c06-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.574 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:03 compute-0 ovn_controller[95452]: 2025-12-10T10:26:03Z|00101|binding|INFO|Releasing lport 4f6a2c06-ec46-4119-90a8-7e67227137b7 from this chassis (sb_readonly=0)
Dec 10 10:26:03 compute-0 ovn_controller[95452]: 2025-12-10T10:26:03Z|00102|binding|INFO|Setting lport 4f6a2c06-ec46-4119-90a8-7e67227137b7 down in Southbound
Dec 10 10:26:03 compute-0 ovn_controller[95452]: 2025-12-10T10:26:03Z|00103|binding|INFO|Removing iface tap4f6a2c06-ec ovn-installed in OVS
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.579 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.584 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:70:4e 10.100.0.18'], port_security=['fa:16:3e:28:70:4e 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'f2349666-5326-4e13-bd6a-8d6adb3613ad', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '4', 'neutron:security_group_ids': '125cfa65-5fc5-44d8-8154-7d126b287359', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7d4f5c1-67c9-4f9d-8014-0361c6ae4f32, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=4f6a2c06-ec46-4119-90a8-7e67227137b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.586 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 4f6a2c06-ec46-4119-90a8-7e67227137b7 in datapath 5f4c16d5-f7c5-440e-94e4-418777bf573c unbound from our chassis
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.587 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5f4c16d5-f7c5-440e-94e4-418777bf573c
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.593 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.609 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7834bf-b545-4ebe-b4fb-213b51815952]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:03 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec 10 10:26:03 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 12.897s CPU time.
Dec 10 10:26:03 compute-0 systemd-machined[153379]: Machine qemu-7-instance-00000007 terminated.
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.637 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[e636013e-b8d9-440a-81be-c5175d9f32ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.642 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[9444085d-b6aa-4386-a426-cdfa03582228]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.671 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[0c0593bc-86dd-4b74-873e-e829b80877b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.689 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[900719bc-6286-4dd4-b204-978019a4677d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5f4c16d5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:a4:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 7, 'rx_bytes': 790, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 7, 'rx_bytes': 790, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 332917, 'reachable_time': 31725, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 7, 'inoctets': 524, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 7, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 524, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 7, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216500, 'error': None, 'target': 'ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.707 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[132e02c6-5813-4e16-ba6b-077af211dfa7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap5f4c16d5-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 332930, 'tstamp': 332930}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216501, 'error': None, 'target': 'ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap5f4c16d5-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 332935, 'tstamp': 332935}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216501, 'error': None, 'target': 'ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.709 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f4c16d5-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.711 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.723 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.724 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5f4c16d5-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.725 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.725 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5f4c16d5-f0, col_values=(('external_ids', {'iface-id': '11811771-dcf3-4d12-93ff-39d266ef1136'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:03 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:03.725 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.762 186993 INFO nova.virt.libvirt.driver [-] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Instance destroyed successfully.
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.762 186993 DEBUG nova.objects.instance [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid f2349666-5326-4e13-bd6a-8d6adb3613ad obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.775 186993 DEBUG nova.virt.libvirt.vif [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:25:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1709359727',display_name='tempest-TestNetworkBasicOps-server-1709359727',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1709359727',id=7,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBLRFf5DyWtKsMGcD7XtVySQIkHWqQ76xlQRhLQIAjovTuJ3R2CqeAYlxplAlLStHgCRnpSjWsr5HOYIdaMnb6IqUt5HYNmyfJaGaXlFZWwJAES3GhTJmJu4W+lxCogz9w==',key_name='tempest-TestNetworkBasicOps-33435830',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:25:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-9o4z4nxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:25:37Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=f2349666-5326-4e13-bd6a-8d6adb3613ad,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "address": "fa:16:3e:28:70:4e", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f6a2c06-ec", "ovs_interfaceid": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.776 186993 DEBUG nova.network.os_vif_util [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "address": "fa:16:3e:28:70:4e", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f6a2c06-ec", "ovs_interfaceid": "4f6a2c06-ec46-4119-90a8-7e67227137b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.777 186993 DEBUG nova.network.os_vif_util [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:70:4e,bridge_name='br-int',has_traffic_filtering=True,id=4f6a2c06-ec46-4119-90a8-7e67227137b7,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f6a2c06-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.777 186993 DEBUG os_vif [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:70:4e,bridge_name='br-int',has_traffic_filtering=True,id=4f6a2c06-ec46-4119-90a8-7e67227137b7,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f6a2c06-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.780 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.780 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f6a2c06-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.782 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.785 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.786 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.788 186993 INFO os_vif [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:70:4e,bridge_name='br-int',has_traffic_filtering=True,id=4f6a2c06-ec46-4119-90a8-7e67227137b7,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f6a2c06-ec')
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.788 186993 INFO nova.virt.libvirt.driver [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Deleting instance files /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad_del
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.789 186993 INFO nova.virt.libvirt.driver [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Deletion of /var/lib/nova/instances/f2349666-5326-4e13-bd6a-8d6adb3613ad_del complete
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.842 186993 INFO nova.compute.manager [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.842 186993 DEBUG oslo.service.loopingcall [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.843 186993 DEBUG nova.compute.manager [-] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:26:03 compute-0 nova_compute[186989]: 2025-12-10 10:26:03.843 186993 DEBUG nova.network.neutron [-] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:26:04 compute-0 nova_compute[186989]: 2025-12-10 10:26:04.654 186993 DEBUG nova.compute.manager [req-aef011ed-96ae-4e54-877f-cb75ae820a1e req-20a015a3-d5ae-4005-a7f2-ec69b4dcd44d 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Received event network-vif-unplugged-4f6a2c06-ec46-4119-90a8-7e67227137b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:04 compute-0 nova_compute[186989]: 2025-12-10 10:26:04.655 186993 DEBUG oslo_concurrency.lockutils [req-aef011ed-96ae-4e54-877f-cb75ae820a1e req-20a015a3-d5ae-4005-a7f2-ec69b4dcd44d 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:04 compute-0 nova_compute[186989]: 2025-12-10 10:26:04.655 186993 DEBUG oslo_concurrency.lockutils [req-aef011ed-96ae-4e54-877f-cb75ae820a1e req-20a015a3-d5ae-4005-a7f2-ec69b4dcd44d 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:04 compute-0 nova_compute[186989]: 2025-12-10 10:26:04.656 186993 DEBUG oslo_concurrency.lockutils [req-aef011ed-96ae-4e54-877f-cb75ae820a1e req-20a015a3-d5ae-4005-a7f2-ec69b4dcd44d 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:04 compute-0 nova_compute[186989]: 2025-12-10 10:26:04.656 186993 DEBUG nova.compute.manager [req-aef011ed-96ae-4e54-877f-cb75ae820a1e req-20a015a3-d5ae-4005-a7f2-ec69b4dcd44d 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] No waiting events found dispatching network-vif-unplugged-4f6a2c06-ec46-4119-90a8-7e67227137b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:26:04 compute-0 nova_compute[186989]: 2025-12-10 10:26:04.656 186993 DEBUG nova.compute.manager [req-aef011ed-96ae-4e54-877f-cb75ae820a1e req-20a015a3-d5ae-4005-a7f2-ec69b4dcd44d 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Received event network-vif-unplugged-4f6a2c06-ec46-4119-90a8-7e67227137b7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 10 10:26:05 compute-0 nova_compute[186989]: 2025-12-10 10:26:05.777 186993 DEBUG nova.network.neutron [-] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:26:05 compute-0 nova_compute[186989]: 2025-12-10 10:26:05.794 186993 INFO nova.compute.manager [-] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Took 1.95 seconds to deallocate network for instance.
Dec 10 10:26:05 compute-0 nova_compute[186989]: 2025-12-10 10:26:05.837 186993 DEBUG oslo_concurrency.lockutils [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:05 compute-0 nova_compute[186989]: 2025-12-10 10:26:05.837 186993 DEBUG oslo_concurrency.lockutils [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:05 compute-0 nova_compute[186989]: 2025-12-10 10:26:05.922 186993 DEBUG nova.compute.provider_tree [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:26:05 compute-0 nova_compute[186989]: 2025-12-10 10:26:05.939 186993 DEBUG nova.scheduler.client.report [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:26:05 compute-0 nova_compute[186989]: 2025-12-10 10:26:05.965 186993 DEBUG oslo_concurrency.lockutils [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:05 compute-0 nova_compute[186989]: 2025-12-10 10:26:05.994 186993 INFO nova.scheduler.client.report [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance f2349666-5326-4e13-bd6a-8d6adb3613ad
Dec 10 10:26:06 compute-0 nova_compute[186989]: 2025-12-10 10:26:06.053 186993 DEBUG oslo_concurrency.lockutils [None req-a6cfeb28-1979-4ac6-a06b-d1a2a5d964eb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:06 compute-0 nova_compute[186989]: 2025-12-10 10:26:06.840 186993 DEBUG nova.compute.manager [req-9f28c505-c8a3-49fa-970a-9cb8a9552e8f req-2a5be1d0-d3af-4f77-b848-825ae57d2abf 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Received event network-vif-plugged-4f6a2c06-ec46-4119-90a8-7e67227137b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:06 compute-0 nova_compute[186989]: 2025-12-10 10:26:06.841 186993 DEBUG oslo_concurrency.lockutils [req-9f28c505-c8a3-49fa-970a-9cb8a9552e8f req-2a5be1d0-d3af-4f77-b848-825ae57d2abf 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:06 compute-0 nova_compute[186989]: 2025-12-10 10:26:06.842 186993 DEBUG oslo_concurrency.lockutils [req-9f28c505-c8a3-49fa-970a-9cb8a9552e8f req-2a5be1d0-d3af-4f77-b848-825ae57d2abf 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:06 compute-0 nova_compute[186989]: 2025-12-10 10:26:06.843 186993 DEBUG oslo_concurrency.lockutils [req-9f28c505-c8a3-49fa-970a-9cb8a9552e8f req-2a5be1d0-d3af-4f77-b848-825ae57d2abf 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "f2349666-5326-4e13-bd6a-8d6adb3613ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:06 compute-0 nova_compute[186989]: 2025-12-10 10:26:06.843 186993 DEBUG nova.compute.manager [req-9f28c505-c8a3-49fa-970a-9cb8a9552e8f req-2a5be1d0-d3af-4f77-b848-825ae57d2abf 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] No waiting events found dispatching network-vif-plugged-4f6a2c06-ec46-4119-90a8-7e67227137b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:26:06 compute-0 nova_compute[186989]: 2025-12-10 10:26:06.844 186993 WARNING nova.compute.manager [req-9f28c505-c8a3-49fa-970a-9cb8a9552e8f req-2a5be1d0-d3af-4f77-b848-825ae57d2abf 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Received unexpected event network-vif-plugged-4f6a2c06-ec46-4119-90a8-7e67227137b7 for instance with vm_state deleted and task_state None.
Dec 10 10:26:06 compute-0 nova_compute[186989]: 2025-12-10 10:26:06.844 186993 DEBUG nova.compute.manager [req-9f28c505-c8a3-49fa-970a-9cb8a9552e8f req-2a5be1d0-d3af-4f77-b848-825ae57d2abf 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Received event network-vif-deleted-4f6a2c06-ec46-4119-90a8-7e67227137b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:07 compute-0 nova_compute[186989]: 2025-12-10 10:26:07.621 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:08 compute-0 podman[216518]: 2025-12-10 10:26:08.060555099 +0000 UTC m=+0.093677517 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=edpm)
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.704 186993 DEBUG oslo_concurrency.lockutils [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "interface-77bc78a9-08a2-448f-b9c0-cfd055940b6b-56fa819b-df3d-49ba-a5c9-698cc74fb8aa" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.705 186993 DEBUG oslo_concurrency.lockutils [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "interface-77bc78a9-08a2-448f-b9c0-cfd055940b6b-56fa819b-df3d-49ba-a5c9-698cc74fb8aa" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.730 186993 DEBUG nova.objects.instance [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'flavor' on Instance uuid 77bc78a9-08a2-448f-b9c0-cfd055940b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.758 186993 DEBUG nova.virt.libvirt.vif [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:24:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1350364126',display_name='tempest-TestNetworkBasicOps-server-1350364126',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1350364126',id=6,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJq75Cwf3fE3Eo6aSvVEw6ZmFrXTJMC7KUQtffEBX2EhKh3zXojN07EirD/YNtNzowas01LwSdkjT048U0kK1Pkd1upNeKr0R9xHgP3GlO+3xbjcu8vRl65sDom+kt9XeQ==',key_name='tempest-TestNetworkBasicOps-368979134',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:24:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-b5az00dp',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:24:45Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=77bc78a9-08a2-448f-b9c0-cfd055940b6b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.759 186993 DEBUG nova.network.os_vif_util [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.760 186993 DEBUG nova.network.os_vif_util [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:73:65:10,bridge_name='br-int',has_traffic_filtering=True,id=56fa819b-df3d-49ba-a5c9-698cc74fb8aa,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fa819b-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.765 186993 DEBUG nova.virt.libvirt.guest [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:73:65:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap56fa819b-df"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.769 186993 DEBUG nova.virt.libvirt.guest [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:73:65:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap56fa819b-df"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.773 186993 DEBUG nova.virt.libvirt.driver [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Attempting to detach device tap56fa819b-df from instance 77bc78a9-08a2-448f-b9c0-cfd055940b6b from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.773 186993 DEBUG nova.virt.libvirt.guest [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] detach device xml: <interface type="ethernet">
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <mac address="fa:16:3e:73:65:10"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <model type="virtio"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <mtu size="1442"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <target dev="tap56fa819b-df"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]: </interface>
Dec 10 10:26:08 compute-0 nova_compute[186989]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.782 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.785 186993 DEBUG nova.virt.libvirt.guest [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:73:65:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap56fa819b-df"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.789 186993 DEBUG nova.virt.libvirt.guest [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:73:65:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap56fa819b-df"/></interface>not found in domain: <domain type='kvm' id='6'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <name>instance-00000006</name>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <uuid>77bc78a9-08a2-448f-b9c0-cfd055940b6b</uuid>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-1350364126</nova:name>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:25:13</nova:creationTime>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:port uuid="507bf448-94f2-4c23-86a4-a13b31717ff8">
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:port uuid="56fa819b-df3d-49ba-a5c9-698cc74fb8aa">
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:26:08 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <memory unit='KiB'>131072</memory>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <vcpu placement='static'>1</vcpu>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <resource>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <partition>/machine</partition>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </resource>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <sysinfo type='smbios'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <system>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <entry name='manufacturer'>RDO</entry>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <entry name='product'>OpenStack Compute</entry>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <entry name='serial'>77bc78a9-08a2-448f-b9c0-cfd055940b6b</entry>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <entry name='uuid'>77bc78a9-08a2-448f-b9c0-cfd055940b6b</entry>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <entry name='family'>Virtual Machine</entry>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </system>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <os>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <boot dev='hd'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <smbios mode='sysinfo'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </os>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <features>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <vmcoreinfo state='on'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </features>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <cpu mode='custom' match='exact' check='full'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <vendor>AMD</vendor>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='x2apic'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc-deadline'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='hypervisor'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc_adjust'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='spec-ctrl'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='stibp'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='ssbd'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='cmp_legacy'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='overflow-recov'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='succor'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='ibrs'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='amd-ssbd'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='virt-ssbd'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='lbrv'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='tsc-scale'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='vmcb-clean'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='flushbyasid'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='pause-filter'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='pfthreshold'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='xsaves'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='svm'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='topoext'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='npt'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='nrip-save'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <clock offset='utc'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <timer name='pit' tickpolicy='delay'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <timer name='hpet' present='no'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <on_poweroff>destroy</on_poweroff>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <on_reboot>restart</on_reboot>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <on_crash>destroy</on_crash>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <disk type='file' device='disk'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk' index='2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <backingStore type='file' index='3'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:         <format type='raw'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:         <source file='/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:         <backingStore/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       </backingStore>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target dev='vda' bus='virtio'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='virtio-disk0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <disk type='file' device='cdrom'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <driver name='qemu' type='raw' cache='none'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.config' index='1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <backingStore/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target dev='sda' bus='sata'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <readonly/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='sata0-0-0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='0' model='pcie-root'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pcie.0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='1' port='0x10'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='2' port='0x11'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='3' port='0x12'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.3'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='4' port='0x13'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.4'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='5' port='0x14'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.5'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='6' port='0x15'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.6'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='7' port='0x16'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.7'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='8' port='0x17'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.8'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='9' port='0x18'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.9'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='10' port='0x19'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.10'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='11' port='0x1a'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.11'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='12' port='0x1b'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.12'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='13' port='0x1c'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.13'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='14' port='0x1d'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.14'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='15' port='0x1e'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.15'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='16' port='0x1f'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.16'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='17' port='0x20'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.17'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='18' port='0x21'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.18'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='19' port='0x22'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.19'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='20' port='0x23'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.20'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='21' port='0x24'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.21'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='22' port='0x25'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.22'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='23' port='0x26'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.23'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='24' port='0x27'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.24'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='25' port='0x28'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.25'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-pci-bridge'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.26'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='usb'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='sata' index='0'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='ide'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <interface type='ethernet'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <mac address='fa:16:3e:89:74:34'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target dev='tap507bf448-94'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model type='virtio'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <driver name='vhost' rx_queue_size='512'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <mtu size='1442'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='net0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <interface type='ethernet'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <mac address='fa:16:3e:73:65:10'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target dev='tap56fa819b-df'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model type='virtio'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <driver name='vhost' rx_queue_size='512'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <mtu size='1442'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='net1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <serial type='pty'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/console.log' append='off'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target type='isa-serial' port='0'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:         <model name='isa-serial'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       </target>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <console type='pty' tty='/dev/pts/0'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/console.log' append='off'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target type='serial' port='0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </console>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <input type='tablet' bus='usb'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='input0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='usb' bus='0' port='1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </input>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <input type='mouse' bus='ps2'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='input1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </input>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <input type='keyboard' bus='ps2'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='input2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </input>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <listen type='address' address='::0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </graphics>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <audio id='1' type='none'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <video>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model type='virtio' heads='1' primary='yes'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='video0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </video>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <watchdog model='itco' action='reset'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='watchdog0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </watchdog>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <memballoon model='virtio'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <stats period='10'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='balloon0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <rng model='virtio'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <backend model='random'>/dev/urandom</backend>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='rng0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <label>system_u:system_r:svirt_t:s0:c611,c704</label>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c611,c704</imagelabel>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <label>+107:+107</label>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <imagelabel>+107:+107</imagelabel>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:26:08 compute-0 nova_compute[186989]: </domain>
Dec 10 10:26:08 compute-0 nova_compute[186989]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.789 186993 INFO nova.virt.libvirt.driver [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully detached device tap56fa819b-df from instance 77bc78a9-08a2-448f-b9c0-cfd055940b6b from the persistent domain config.
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.790 186993 DEBUG nova.virt.libvirt.driver [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] (1/8): Attempting to detach device tap56fa819b-df with device alias net1 from instance 77bc78a9-08a2-448f-b9c0-cfd055940b6b from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.790 186993 DEBUG nova.virt.libvirt.guest [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] detach device xml: <interface type="ethernet">
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <mac address="fa:16:3e:73:65:10"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <model type="virtio"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <mtu size="1442"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <target dev="tap56fa819b-df"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]: </interface>
Dec 10 10:26:08 compute-0 nova_compute[186989]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec 10 10:26:08 compute-0 kernel: tap56fa819b-df (unregistering): left promiscuous mode
Dec 10 10:26:08 compute-0 NetworkManager[55541]: <info>  [1765362368.8862] device (tap56fa819b-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.894 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:08 compute-0 ovn_controller[95452]: 2025-12-10T10:26:08Z|00104|binding|INFO|Releasing lport 56fa819b-df3d-49ba-a5c9-698cc74fb8aa from this chassis (sb_readonly=0)
Dec 10 10:26:08 compute-0 ovn_controller[95452]: 2025-12-10T10:26:08Z|00105|binding|INFO|Setting lport 56fa819b-df3d-49ba-a5c9-698cc74fb8aa down in Southbound
Dec 10 10:26:08 compute-0 ovn_controller[95452]: 2025-12-10T10:26:08Z|00106|binding|INFO|Removing iface tap56fa819b-df ovn-installed in OVS
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.897 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.900 186993 DEBUG nova.virt.libvirt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Received event <DeviceRemovedEvent: 1765362368.8998878, 77bc78a9-08a2-448f-b9c0-cfd055940b6b => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.901 186993 DEBUG nova.virt.libvirt.driver [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Start waiting for the detach event from libvirt for device tap56fa819b-df with device alias net1 for instance 77bc78a9-08a2-448f-b9c0-cfd055940b6b _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.902 186993 DEBUG nova.virt.libvirt.guest [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:73:65:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap56fa819b-df"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.905 186993 DEBUG nova.virt.libvirt.guest [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:73:65:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap56fa819b-df"/></interface>not found in domain: <domain type='kvm' id='6'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <name>instance-00000006</name>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <uuid>77bc78a9-08a2-448f-b9c0-cfd055940b6b</uuid>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-1350364126</nova:name>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:25:13</nova:creationTime>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:port uuid="507bf448-94f2-4c23-86a4-a13b31717ff8">
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:port uuid="56fa819b-df3d-49ba-a5c9-698cc74fb8aa">
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:26:08 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <memory unit='KiB'>131072</memory>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <vcpu placement='static'>1</vcpu>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <resource>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <partition>/machine</partition>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </resource>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <sysinfo type='smbios'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <system>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <entry name='manufacturer'>RDO</entry>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <entry name='product'>OpenStack Compute</entry>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <entry name='serial'>77bc78a9-08a2-448f-b9c0-cfd055940b6b</entry>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <entry name='uuid'>77bc78a9-08a2-448f-b9c0-cfd055940b6b</entry>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <entry name='family'>Virtual Machine</entry>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </system>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <os>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <boot dev='hd'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <smbios mode='sysinfo'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </os>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <features>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <vmcoreinfo state='on'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </features>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <cpu mode='custom' match='exact' check='full'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <vendor>AMD</vendor>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='x2apic'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc-deadline'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='hypervisor'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc_adjust'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='spec-ctrl'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='stibp'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='ssbd'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='cmp_legacy'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='overflow-recov'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='succor'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='ibrs'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='amd-ssbd'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='virt-ssbd'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='lbrv'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='tsc-scale'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='vmcb-clean'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='flushbyasid'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='pause-filter'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='pfthreshold'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='xsaves'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='svm'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='require' name='topoext'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='npt'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <feature policy='disable' name='nrip-save'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <clock offset='utc'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <timer name='pit' tickpolicy='delay'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <timer name='hpet' present='no'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <on_poweroff>destroy</on_poweroff>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <on_reboot>restart</on_reboot>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <on_crash>destroy</on_crash>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <disk type='file' device='disk'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk' index='2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <backingStore type='file' index='3'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:         <format type='raw'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:         <source file='/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:         <backingStore/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       </backingStore>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target dev='vda' bus='virtio'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='virtio-disk0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <disk type='file' device='cdrom'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <driver name='qemu' type='raw' cache='none'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.config' index='1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <backingStore/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target dev='sda' bus='sata'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <readonly/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='sata0-0-0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='0' model='pcie-root'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pcie.0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='1' port='0x10'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='2' port='0x11'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='3' port='0x12'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.3'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='4' port='0x13'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.4'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='5' port='0x14'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.5'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='6' port='0x15'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.6'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='7' port='0x16'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.7'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='8' port='0x17'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.8'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='9' port='0x18'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.9'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='10' port='0x19'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.10'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='11' port='0x1a'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.11'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='12' port='0x1b'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.12'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='13' port='0x1c'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.13'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='14' port='0x1d'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.14'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='15' port='0x1e'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.15'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='16' port='0x1f'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.16'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='17' port='0x20'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.17'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='18' port='0x21'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.18'/>
Dec 10 10:26:08 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:08.906 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:65:10 10.100.0.21', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7d4f5c1-67c9-4f9d-8014-0361c6ae4f32, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=56fa819b-df3d-49ba-a5c9-698cc74fb8aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='19' port='0x22'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.19'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='20' port='0x23'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.20'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='21' port='0x24'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.21'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='22' port='0x25'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.22'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='23' port='0x26'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.23'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='24' port='0x27'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.24'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target chassis='25' port='0x28'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.25'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model name='pcie-pci-bridge'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='pci.26'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='usb'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <controller type='sata' index='0'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='ide'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <interface type='ethernet'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <mac address='fa:16:3e:89:74:34'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target dev='tap507bf448-94'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model type='virtio'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <driver name='vhost' rx_queue_size='512'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <mtu size='1442'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='net0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <serial type='pty'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/console.log' append='off'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target type='isa-serial' port='0'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:         <model name='isa-serial'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       </target>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <console type='pty' tty='/dev/pts/0'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/console.log' append='off'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <target type='serial' port='0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </console>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <input type='tablet' bus='usb'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='input0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='usb' bus='0' port='1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </input>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <input type='mouse' bus='ps2'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='input1'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </input>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <input type='keyboard' bus='ps2'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='input2'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </input>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <listen type='address' address='::0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </graphics>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <audio id='1' type='none'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <video>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <model type='virtio' heads='1' primary='yes'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='video0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </video>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <watchdog model='itco' action='reset'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='watchdog0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </watchdog>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <memballoon model='virtio'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <stats period='10'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='balloon0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <rng model='virtio'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <backend model='random'>/dev/urandom</backend>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <alias name='rng0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <label>system_u:system_r:svirt_t:s0:c611,c704</label>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c611,c704</imagelabel>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <label>+107:+107</label>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <imagelabel>+107:+107</imagelabel>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:26:08 compute-0 nova_compute[186989]: </domain>
Dec 10 10:26:08 compute-0 nova_compute[186989]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.906 186993 INFO nova.virt.libvirt.driver [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully detached device tap56fa819b-df from instance 77bc78a9-08a2-448f-b9c0-cfd055940b6b from the live domain config.
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.907 186993 DEBUG nova.virt.libvirt.vif [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:24:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1350364126',display_name='tempest-TestNetworkBasicOps-server-1350364126',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1350364126',id=6,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJq75Cwf3fE3Eo6aSvVEw6ZmFrXTJMC7KUQtffEBX2EhKh3zXojN07EirD/YNtNzowas01LwSdkjT048U0kK1Pkd1upNeKr0R9xHgP3GlO+3xbjcu8vRl65sDom+kt9XeQ==',key_name='tempest-TestNetworkBasicOps-368979134',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:24:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-b5az00dp',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:24:45Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=77bc78a9-08a2-448f-b9c0-cfd055940b6b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.907 186993 DEBUG nova.network.os_vif_util [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.908 186993 DEBUG nova.network.os_vif_util [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:73:65:10,bridge_name='br-int',has_traffic_filtering=True,id=56fa819b-df3d-49ba-a5c9-698cc74fb8aa,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fa819b-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.908 186993 DEBUG os_vif [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:65:10,bridge_name='br-int',has_traffic_filtering=True,id=56fa819b-df3d-49ba-a5c9-698cc74fb8aa,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fa819b-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:26:08 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:08.908 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 56fa819b-df3d-49ba-a5c9-698cc74fb8aa in datapath 5f4c16d5-f7c5-440e-94e4-418777bf573c unbound from our chassis
Dec 10 10:26:08 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:08.909 104302 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5f4c16d5-f7c5-440e-94e4-418777bf573c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.910 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.911 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56fa819b-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:08 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:08.910 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[18a4f21d-ec62-4764-bc3b-5fa3397ce5fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:08 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:08.910 104302 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c namespace which is not needed anymore
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.911 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.912 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.913 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.915 186993 INFO os_vif [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:65:10,bridge_name='br-int',has_traffic_filtering=True,id=56fa819b-df3d-49ba-a5c9-698cc74fb8aa,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fa819b-df')
Dec 10 10:26:08 compute-0 nova_compute[186989]: 2025-12-10 10:26:08.916 186993 DEBUG nova.virt.libvirt.guest [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-1350364126</nova:name>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:26:08</nova:creationTime>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     <nova:port uuid="507bf448-94f2-4c23-86a4-a13b31717ff8">
Dec 10 10:26:08 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 10 10:26:08 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:26:08 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:26:08 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:26:08 compute-0 nova_compute[186989]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 10 10:26:09 compute-0 neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c[216100]: [NOTICE]   (216104) : haproxy version is 2.8.14-c23fe91
Dec 10 10:26:09 compute-0 neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c[216100]: [NOTICE]   (216104) : path to executable is /usr/sbin/haproxy
Dec 10 10:26:09 compute-0 neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c[216100]: [WARNING]  (216104) : Exiting Master process...
Dec 10 10:26:09 compute-0 neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c[216100]: [WARNING]  (216104) : Exiting Master process...
Dec 10 10:26:09 compute-0 neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c[216100]: [ALERT]    (216104) : Current worker (216106) exited with code 143 (Terminated)
Dec 10 10:26:09 compute-0 neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c[216100]: [WARNING]  (216104) : All workers exited. Exiting... (0)
Dec 10 10:26:09 compute-0 systemd[1]: libpod-327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26.scope: Deactivated successfully.
Dec 10 10:26:09 compute-0 podman[216562]: 2025-12-10 10:26:09.090373133 +0000 UTC m=+0.065853297 container died 327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Dec 10 10:26:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26-userdata-shm.mount: Deactivated successfully.
Dec 10 10:26:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-52544b67c9d2ec5baf671eaba519b386032c9d27ce926ff94d498498cd40d92e-merged.mount: Deactivated successfully.
Dec 10 10:26:09 compute-0 podman[216562]: 2025-12-10 10:26:09.127736443 +0000 UTC m=+0.103216607 container cleanup 327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 10 10:26:09 compute-0 systemd[1]: libpod-conmon-327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26.scope: Deactivated successfully.
Dec 10 10:26:09 compute-0 podman[216589]: 2025-12-10 10:26:09.195486391 +0000 UTC m=+0.046791447 container remove 327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 10 10:26:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:09.201 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[ae25df41-b561-4175-841a-b34255b845b0]: (4, ('Wed Dec 10 10:26:09 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c (327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26)\n327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26\nWed Dec 10 10:26:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c (327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26)\n327041d154fdc1931ba4d11ac01e244569b16d56eeee683c28b0dca05454df26\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:09.204 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[680f55db-7582-44d9-933e-6413e8028ddf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:09.205 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f4c16d5-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:09 compute-0 kernel: tap5f4c16d5-f0: left promiscuous mode
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.207 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.219 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:09.223 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa7c3b4-4a83-4cdc-b65b-55b79b0bbd96]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:09.237 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[ff5e9e8f-4e22-4751-ba02-f9d8497a0342]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:09.238 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad5c3d5-2bd4-46ba-affc-2c76695269e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:09.256 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[94d5a61c-ce9a-4f9b-ab5c-fea6be048f56]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 332910, 'reachable_time': 28974, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216604, 'error': None, 'target': 'ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:09.259 104414 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5f4c16d5-f7c5-440e-94e4-418777bf573c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 10 10:26:09 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:09.260 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[4565f776-7720-4def-94cd-953749370270]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:09 compute-0 systemd[1]: run-netns-ovnmeta\x2d5f4c16d5\x2df7c5\x2d440e\x2d94e4\x2d418777bf573c.mount: Deactivated successfully.
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.614 186993 DEBUG nova.compute.manager [req-ac4fa8ea-b9f7-43fd-b10d-11f293d17aae req-175d37bd-bde1-4841-a18b-edff0b007e07 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-vif-unplugged-56fa819b-df3d-49ba-a5c9-698cc74fb8aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.614 186993 DEBUG oslo_concurrency.lockutils [req-ac4fa8ea-b9f7-43fd-b10d-11f293d17aae req-175d37bd-bde1-4841-a18b-edff0b007e07 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.615 186993 DEBUG oslo_concurrency.lockutils [req-ac4fa8ea-b9f7-43fd-b10d-11f293d17aae req-175d37bd-bde1-4841-a18b-edff0b007e07 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.615 186993 DEBUG oslo_concurrency.lockutils [req-ac4fa8ea-b9f7-43fd-b10d-11f293d17aae req-175d37bd-bde1-4841-a18b-edff0b007e07 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.615 186993 DEBUG nova.compute.manager [req-ac4fa8ea-b9f7-43fd-b10d-11f293d17aae req-175d37bd-bde1-4841-a18b-edff0b007e07 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] No waiting events found dispatching network-vif-unplugged-56fa819b-df3d-49ba-a5c9-698cc74fb8aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.615 186993 WARNING nova.compute.manager [req-ac4fa8ea-b9f7-43fd-b10d-11f293d17aae req-175d37bd-bde1-4841-a18b-edff0b007e07 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received unexpected event network-vif-unplugged-56fa819b-df3d-49ba-a5c9-698cc74fb8aa for instance with vm_state active and task_state None.
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.840 186993 DEBUG oslo_concurrency.lockutils [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.841 186993 DEBUG oslo_concurrency.lockutils [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.841 186993 DEBUG nova.network.neutron [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.880 186993 DEBUG nova.compute.manager [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-vif-deleted-56fa819b-df3d-49ba-a5c9-698cc74fb8aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.880 186993 INFO nova.compute.manager [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Neutron deleted interface 56fa819b-df3d-49ba-a5c9-698cc74fb8aa; detaching it from the instance and deleting it from the info cache
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.881 186993 DEBUG nova.network.neutron [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updating instance_info_cache with network_info: [{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.901 186993 DEBUG nova.objects.instance [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lazy-loading 'system_metadata' on Instance uuid 77bc78a9-08a2-448f-b9c0-cfd055940b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.934 186993 DEBUG nova.objects.instance [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lazy-loading 'flavor' on Instance uuid 77bc78a9-08a2-448f-b9c0-cfd055940b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.956 186993 DEBUG nova.virt.libvirt.vif [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:24:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1350364126',display_name='tempest-TestNetworkBasicOps-server-1350364126',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1350364126',id=6,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJq75Cwf3fE3Eo6aSvVEw6ZmFrXTJMC7KUQtffEBX2EhKh3zXojN07EirD/YNtNzowas01LwSdkjT048U0kK1Pkd1upNeKr0R9xHgP3GlO+3xbjcu8vRl65sDom+kt9XeQ==',key_name='tempest-TestNetworkBasicOps-368979134',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:24:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-b5az00dp',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:24:45Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=77bc78a9-08a2-448f-b9c0-cfd055940b6b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.957 186993 DEBUG nova.network.os_vif_util [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Converting VIF {"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.958 186993 DEBUG nova.network.os_vif_util [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:73:65:10,bridge_name='br-int',has_traffic_filtering=True,id=56fa819b-df3d-49ba-a5c9-698cc74fb8aa,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fa819b-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.964 186993 DEBUG nova.virt.libvirt.guest [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:73:65:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap56fa819b-df"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.969 186993 DEBUG nova.virt.libvirt.guest [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:73:65:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap56fa819b-df"/></interface>not found in domain: <domain type='kvm' id='6'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <name>instance-00000006</name>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <uuid>77bc78a9-08a2-448f-b9c0-cfd055940b6b</uuid>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-1350364126</nova:name>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:26:08</nova:creationTime>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:port uuid="507bf448-94f2-4c23-86a4-a13b31717ff8">
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:26:09 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <memory unit='KiB'>131072</memory>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <vcpu placement='static'>1</vcpu>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <resource>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <partition>/machine</partition>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </resource>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <sysinfo type='smbios'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <system>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <entry name='manufacturer'>RDO</entry>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <entry name='product'>OpenStack Compute</entry>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <entry name='serial'>77bc78a9-08a2-448f-b9c0-cfd055940b6b</entry>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <entry name='uuid'>77bc78a9-08a2-448f-b9c0-cfd055940b6b</entry>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <entry name='family'>Virtual Machine</entry>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </system>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <os>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <boot dev='hd'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <smbios mode='sysinfo'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </os>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <features>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <vmcoreinfo state='on'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </features>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <cpu mode='custom' match='exact' check='full'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <vendor>AMD</vendor>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='x2apic'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc-deadline'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='hypervisor'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc_adjust'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='spec-ctrl'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='stibp'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='ssbd'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='cmp_legacy'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='overflow-recov'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='succor'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='ibrs'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='amd-ssbd'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='virt-ssbd'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='lbrv'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='tsc-scale'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='vmcb-clean'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='flushbyasid'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='pause-filter'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='pfthreshold'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='xsaves'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='svm'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='topoext'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='npt'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='nrip-save'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <clock offset='utc'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <timer name='pit' tickpolicy='delay'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <timer name='hpet' present='no'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <on_poweroff>destroy</on_poweroff>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <on_reboot>restart</on_reboot>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <on_crash>destroy</on_crash>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <disk type='file' device='disk'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk' index='2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <backingStore type='file' index='3'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:         <format type='raw'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:         <source file='/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:         <backingStore/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       </backingStore>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target dev='vda' bus='virtio'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='virtio-disk0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <disk type='file' device='cdrom'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <driver name='qemu' type='raw' cache='none'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.config' index='1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <backingStore/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target dev='sda' bus='sata'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <readonly/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='sata0-0-0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='0' model='pcie-root'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pcie.0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='1' port='0x10'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='2' port='0x11'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='3' port='0x12'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.3'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='4' port='0x13'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.4'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='5' port='0x14'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.5'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='6' port='0x15'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.6'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='7' port='0x16'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.7'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='8' port='0x17'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.8'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='9' port='0x18'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.9'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='10' port='0x19'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.10'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='11' port='0x1a'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.11'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='12' port='0x1b'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.12'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='13' port='0x1c'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.13'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='14' port='0x1d'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.14'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='15' port='0x1e'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.15'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='16' port='0x1f'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.16'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='17' port='0x20'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.17'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='18' port='0x21'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.18'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='19' port='0x22'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.19'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='20' port='0x23'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.20'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='21' port='0x24'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.21'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='22' port='0x25'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.22'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='23' port='0x26'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.23'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='24' port='0x27'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.24'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='25' port='0x28'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.25'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-pci-bridge'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.26'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='usb'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='sata' index='0'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='ide'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <interface type='ethernet'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <mac address='fa:16:3e:89:74:34'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target dev='tap507bf448-94'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model type='virtio'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <driver name='vhost' rx_queue_size='512'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <mtu size='1442'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='net0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <serial type='pty'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/console.log' append='off'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target type='isa-serial' port='0'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:         <model name='isa-serial'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       </target>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <console type='pty' tty='/dev/pts/0'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/console.log' append='off'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target type='serial' port='0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </console>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <input type='tablet' bus='usb'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='input0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='usb' bus='0' port='1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </input>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <input type='mouse' bus='ps2'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='input1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </input>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <input type='keyboard' bus='ps2'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='input2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </input>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <listen type='address' address='::0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </graphics>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <audio id='1' type='none'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <video>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model type='virtio' heads='1' primary='yes'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='video0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </video>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <watchdog model='itco' action='reset'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='watchdog0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </watchdog>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <memballoon model='virtio'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <stats period='10'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='balloon0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <rng model='virtio'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <backend model='random'>/dev/urandom</backend>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='rng0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <label>system_u:system_r:svirt_t:s0:c611,c704</label>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c611,c704</imagelabel>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <label>+107:+107</label>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <imagelabel>+107:+107</imagelabel>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:26:09 compute-0 nova_compute[186989]: </domain>
Dec 10 10:26:09 compute-0 nova_compute[186989]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.970 186993 DEBUG nova.virt.libvirt.guest [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:73:65:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap56fa819b-df"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.976 186993 DEBUG nova.virt.libvirt.guest [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:73:65:10"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap56fa819b-df"/></interface>not found in domain: <domain type='kvm' id='6'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <name>instance-00000006</name>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <uuid>77bc78a9-08a2-448f-b9c0-cfd055940b6b</uuid>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-1350364126</nova:name>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:26:08</nova:creationTime>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:port uuid="507bf448-94f2-4c23-86a4-a13b31717ff8">
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:26:09 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <memory unit='KiB'>131072</memory>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <currentMemory unit='KiB'>131072</currentMemory>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <vcpu placement='static'>1</vcpu>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <resource>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <partition>/machine</partition>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </resource>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <sysinfo type='smbios'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <system>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <entry name='manufacturer'>RDO</entry>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <entry name='product'>OpenStack Compute</entry>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <entry name='serial'>77bc78a9-08a2-448f-b9c0-cfd055940b6b</entry>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <entry name='uuid'>77bc78a9-08a2-448f-b9c0-cfd055940b6b</entry>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <entry name='family'>Virtual Machine</entry>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </system>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <os>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <boot dev='hd'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <smbios mode='sysinfo'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </os>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <features>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <vmcoreinfo state='on'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </features>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <cpu mode='custom' match='exact' check='full'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <model fallback='forbid'>EPYC-Rome</model>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <vendor>AMD</vendor>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='x2apic'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc-deadline'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='hypervisor'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='tsc_adjust'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='spec-ctrl'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='stibp'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='ssbd'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='cmp_legacy'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='overflow-recov'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='succor'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='ibrs'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='amd-ssbd'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='virt-ssbd'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='lbrv'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='tsc-scale'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='vmcb-clean'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='flushbyasid'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='pause-filter'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='pfthreshold'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='svme-addr-chk'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='lfence-always-serializing'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='xsaves'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='svm'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='require' name='topoext'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='npt'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <feature policy='disable' name='nrip-save'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <clock offset='utc'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <timer name='pit' tickpolicy='delay'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <timer name='rtc' tickpolicy='catchup'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <timer name='hpet' present='no'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <on_poweroff>destroy</on_poweroff>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <on_reboot>restart</on_reboot>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <on_crash>destroy</on_crash>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <disk type='file' device='disk'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <driver name='qemu' type='qcow2' cache='none'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk' index='2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <backingStore type='file' index='3'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:         <format type='raw'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:         <source file='/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:         <backingStore/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       </backingStore>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target dev='vda' bus='virtio'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='virtio-disk0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <disk type='file' device='cdrom'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <driver name='qemu' type='raw' cache='none'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <source file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/disk.config' index='1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <backingStore/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target dev='sda' bus='sata'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <readonly/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='sata0-0-0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='0' model='pcie-root'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pcie.0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='1' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='1' port='0x10'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='2' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='2' port='0x11'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='3' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='3' port='0x12'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.3'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='4' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='4' port='0x13'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.4'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='5' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='5' port='0x14'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.5'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='6' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='6' port='0x15'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.6'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='7' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='7' port='0x16'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.7'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='8' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='8' port='0x17'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.8'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='9' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='9' port='0x18'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.9'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='10' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='10' port='0x19'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.10'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='11' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='11' port='0x1a'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.11'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='12' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='12' port='0x1b'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.12'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='13' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='13' port='0x1c'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.13'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='14' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='14' port='0x1d'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.14'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='15' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='15' port='0x1e'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.15'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='16' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='16' port='0x1f'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.16'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='17' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='17' port='0x20'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.17'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='18' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='18' port='0x21'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.18'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='19' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='19' port='0x22'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.19'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='20' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='20' port='0x23'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.20'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='21' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='21' port='0x24'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.21'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='22' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='22' port='0x25'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.22'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='23' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='23' port='0x26'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.23'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='24' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='24' port='0x27'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.24'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='25' model='pcie-root-port'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-root-port'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target chassis='25' port='0x28'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.25'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model name='pcie-pci-bridge'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='pci.26'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='usb' index='0' model='piix3-uhci'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='usb'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <controller type='sata' index='0'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='ide'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </controller>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <interface type='ethernet'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <mac address='fa:16:3e:89:74:34'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target dev='tap507bf448-94'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model type='virtio'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <driver name='vhost' rx_queue_size='512'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <mtu size='1442'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='net0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <serial type='pty'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/console.log' append='off'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target type='isa-serial' port='0'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:         <model name='isa-serial'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       </target>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <console type='pty' tty='/dev/pts/0'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <source path='/dev/pts/0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <log file='/var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b/console.log' append='off'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <target type='serial' port='0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='serial0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </console>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <input type='tablet' bus='usb'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='input0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='usb' bus='0' port='1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </input>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <input type='mouse' bus='ps2'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='input1'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </input>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <input type='keyboard' bus='ps2'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='input2'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </input>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <listen type='address' address='::0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </graphics>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <audio id='1' type='none'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <video>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <model type='virtio' heads='1' primary='yes'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='video0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </video>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <watchdog model='itco' action='reset'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='watchdog0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </watchdog>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <memballoon model='virtio'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <stats period='10'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='balloon0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <rng model='virtio'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <backend model='random'>/dev/urandom</backend>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <alias name='rng0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <label>system_u:system_r:svirt_t:s0:c611,c704</label>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c611,c704</imagelabel>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <label>+107:+107</label>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <imagelabel>+107:+107</imagelabel>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </seclabel>
Dec 10 10:26:09 compute-0 nova_compute[186989]: </domain>
Dec 10 10:26:09 compute-0 nova_compute[186989]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.976 186993 WARNING nova.virt.libvirt.driver [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Detaching interface fa:16:3e:73:65:10 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap56fa819b-df' not found.
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.977 186993 DEBUG nova.virt.libvirt.vif [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:24:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1350364126',display_name='tempest-TestNetworkBasicOps-server-1350364126',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1350364126',id=6,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJq75Cwf3fE3Eo6aSvVEw6ZmFrXTJMC7KUQtffEBX2EhKh3zXojN07EirD/YNtNzowas01LwSdkjT048U0kK1Pkd1upNeKr0R9xHgP3GlO+3xbjcu8vRl65sDom+kt9XeQ==',key_name='tempest-TestNetworkBasicOps-368979134',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:24:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-b5az00dp',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:24:45Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=77bc78a9-08a2-448f-b9c0-cfd055940b6b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.978 186993 DEBUG nova.network.os_vif_util [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Converting VIF {"id": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "address": "fa:16:3e:73:65:10", "network": {"id": "5f4c16d5-f7c5-440e-94e4-418777bf573c", "bridge": "br-int", "label": "tempest-network-smoke--1605109093", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fa819b-df", "ovs_interfaceid": "56fa819b-df3d-49ba-a5c9-698cc74fb8aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.979 186993 DEBUG nova.network.os_vif_util [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:73:65:10,bridge_name='br-int',has_traffic_filtering=True,id=56fa819b-df3d-49ba-a5c9-698cc74fb8aa,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fa819b-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.980 186993 DEBUG os_vif [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:65:10,bridge_name='br-int',has_traffic_filtering=True,id=56fa819b-df3d-49ba-a5c9-698cc74fb8aa,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fa819b-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.982 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.982 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56fa819b-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.983 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.986 186993 INFO os_vif [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:65:10,bridge_name='br-int',has_traffic_filtering=True,id=56fa819b-df3d-49ba-a5c9-698cc74fb8aa,network=Network(5f4c16d5-f7c5-440e-94e4-418777bf573c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fa819b-df')
Dec 10 10:26:09 compute-0 nova_compute[186989]: 2025-12-10 10:26:09.987 186993 DEBUG nova.virt.libvirt.guest [req-1527777b-8bee-4124-bcca-38a28e564af9 req-296eee88-1287-4a17-9948-d47302ac8882 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:name>tempest-TestNetworkBasicOps-server-1350364126</nova:name>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:creationTime>2025-12-10 10:26:09</nova:creationTime>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:flavor name="m1.nano">
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:memory>128</nova:memory>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:disk>1</nova:disk>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:swap>0</nova:swap>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:vcpus>1</nova:vcpus>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </nova:flavor>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:owner>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </nova:owner>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   <nova:ports>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     <nova:port uuid="507bf448-94f2-4c23-86a4-a13b31717ff8">
Dec 10 10:26:09 compute-0 nova_compute[186989]:       <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 10 10:26:09 compute-0 nova_compute[186989]:     </nova:port>
Dec 10 10:26:09 compute-0 nova_compute[186989]:   </nova:ports>
Dec 10 10:26:09 compute-0 nova_compute[186989]: </nova:instance>
Dec 10 10:26:09 compute-0 nova_compute[186989]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Dec 10 10:26:10 compute-0 ovn_controller[95452]: 2025-12-10T10:26:10Z|00107|binding|INFO|Releasing lport d89a5400-4042-4e9f-87fc-cd18de8a733b from this chassis (sb_readonly=0)
Dec 10 10:26:10 compute-0 nova_compute[186989]: 2025-12-10 10:26:10.539 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:10 compute-0 nova_compute[186989]: 2025-12-10 10:26:10.965 186993 INFO nova.network.neutron [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Port 56fa819b-df3d-49ba-a5c9-698cc74fb8aa from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Dec 10 10:26:10 compute-0 nova_compute[186989]: 2025-12-10 10:26:10.966 186993 DEBUG nova.network.neutron [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updating instance_info_cache with network_info: [{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:26:10 compute-0 nova_compute[186989]: 2025-12-10 10:26:10.985 186993 DEBUG oslo_concurrency.lockutils [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.009 186993 DEBUG oslo_concurrency.lockutils [None req-ae1a3db1-90c3-406c-a677-9680ca8d06f0 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "interface-77bc78a9-08a2-448f-b9c0-cfd055940b6b-56fa819b-df3d-49ba-a5c9-698cc74fb8aa" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:11 compute-0 podman[216605]: 2025-12-10 10:26:11.071143972 +0000 UTC m=+0.095764384 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.163 186993 DEBUG oslo_concurrency.lockutils [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.164 186993 DEBUG oslo_concurrency.lockutils [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.164 186993 DEBUG oslo_concurrency.lockutils [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.165 186993 DEBUG oslo_concurrency.lockutils [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.165 186993 DEBUG oslo_concurrency.lockutils [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.167 186993 INFO nova.compute.manager [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Terminating instance
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.169 186993 DEBUG nova.compute.manager [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:26:11 compute-0 kernel: tap507bf448-94 (unregistering): left promiscuous mode
Dec 10 10:26:11 compute-0 NetworkManager[55541]: <info>  [1765362371.2014] device (tap507bf448-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:26:11 compute-0 ovn_controller[95452]: 2025-12-10T10:26:11Z|00108|binding|INFO|Releasing lport 507bf448-94f2-4c23-86a4-a13b31717ff8 from this chassis (sb_readonly=0)
Dec 10 10:26:11 compute-0 ovn_controller[95452]: 2025-12-10T10:26:11Z|00109|binding|INFO|Setting lport 507bf448-94f2-4c23-86a4-a13b31717ff8 down in Southbound
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.206 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:11 compute-0 ovn_controller[95452]: 2025-12-10T10:26:11Z|00110|binding|INFO|Removing iface tap507bf448-94 ovn-installed in OVS
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.208 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.215 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:74:34 10.100.0.14'], port_security=['fa:16:3e:89:74:34 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '77bc78a9-08a2-448f-b9c0-cfd055940b6b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16c8959b-0f9c-462b-981f-7320145346f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '4', 'neutron:security_group_ids': '94e10544-6f2b-462a-accb-9b6e66b1904b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd1c942b-2467-4df4-bbeb-865ba1260aad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=507bf448-94f2-4c23-86a4-a13b31717ff8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.216 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 507bf448-94f2-4c23-86a4-a13b31717ff8 in datapath 16c8959b-0f9c-462b-981f-7320145346f8 unbound from our chassis
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.216 104302 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16c8959b-0f9c-462b-981f-7320145346f8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.217 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[3c8f76ad-7ec8-4223-a719-825a90a088e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.218 104302 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8 namespace which is not needed anymore
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.236 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:11 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Dec 10 10:26:11 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 16.900s CPU time.
Dec 10 10:26:11 compute-0 systemd-machined[153379]: Machine qemu-6-instance-00000006 terminated.
Dec 10 10:26:11 compute-0 neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8[215828]: [NOTICE]   (215833) : haproxy version is 2.8.14-c23fe91
Dec 10 10:26:11 compute-0 neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8[215828]: [NOTICE]   (215833) : path to executable is /usr/sbin/haproxy
Dec 10 10:26:11 compute-0 neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8[215828]: [WARNING]  (215833) : Exiting Master process...
Dec 10 10:26:11 compute-0 neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8[215828]: [ALERT]    (215833) : Current worker (215835) exited with code 143 (Terminated)
Dec 10 10:26:11 compute-0 neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8[215828]: [WARNING]  (215833) : All workers exited. Exiting... (0)
Dec 10 10:26:11 compute-0 systemd[1]: libpod-5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf.scope: Deactivated successfully.
Dec 10 10:26:11 compute-0 conmon[215828]: conmon 5ad1bca34551d348ef3d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf.scope/container/memory.events
Dec 10 10:26:11 compute-0 podman[216651]: 2025-12-10 10:26:11.359590222 +0000 UTC m=+0.047695303 container died 5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 10 10:26:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf-userdata-shm.mount: Deactivated successfully.
Dec 10 10:26:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-781f5217a91584c29cfb7658b131453b156889bdc34ea22561100e71d0940656-merged.mount: Deactivated successfully.
Dec 10 10:26:11 compute-0 NetworkManager[55541]: <info>  [1765362371.3936] manager: (tap507bf448-94): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Dec 10 10:26:11 compute-0 podman[216651]: 2025-12-10 10:26:11.399597963 +0000 UTC m=+0.087703044 container cleanup 5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 10 10:26:11 compute-0 systemd[1]: libpod-conmon-5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf.scope: Deactivated successfully.
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.430 186993 INFO nova.virt.libvirt.driver [-] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Instance destroyed successfully.
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.432 186993 DEBUG nova.objects.instance [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid 77bc78a9-08a2-448f-b9c0-cfd055940b6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.445 186993 DEBUG nova.virt.libvirt.vif [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:24:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1350364126',display_name='tempest-TestNetworkBasicOps-server-1350364126',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1350364126',id=6,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJq75Cwf3fE3Eo6aSvVEw6ZmFrXTJMC7KUQtffEBX2EhKh3zXojN07EirD/YNtNzowas01LwSdkjT048U0kK1Pkd1upNeKr0R9xHgP3GlO+3xbjcu8vRl65sDom+kt9XeQ==',key_name='tempest-TestNetworkBasicOps-368979134',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:24:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-b5az00dp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:24:45Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=77bc78a9-08a2-448f-b9c0-cfd055940b6b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.445 186993 DEBUG nova.network.os_vif_util [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.446 186993 DEBUG nova.network.os_vif_util [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:89:74:34,bridge_name='br-int',has_traffic_filtering=True,id=507bf448-94f2-4c23-86a4-a13b31717ff8,network=Network(16c8959b-0f9c-462b-981f-7320145346f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap507bf448-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.446 186993 DEBUG os_vif [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:74:34,bridge_name='br-int',has_traffic_filtering=True,id=507bf448-94f2-4c23-86a4-a13b31717ff8,network=Network(16c8959b-0f9c-462b-981f-7320145346f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap507bf448-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.448 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.448 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap507bf448-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.449 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.451 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.454 186993 INFO os_vif [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:74:34,bridge_name='br-int',has_traffic_filtering=True,id=507bf448-94f2-4c23-86a4-a13b31717ff8,network=Network(16c8959b-0f9c-462b-981f-7320145346f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap507bf448-94')
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.455 186993 INFO nova.virt.libvirt.driver [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Deleting instance files /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b_del
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.455 186993 INFO nova.virt.libvirt.driver [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Deletion of /var/lib/nova/instances/77bc78a9-08a2-448f-b9c0-cfd055940b6b_del complete
Dec 10 10:26:11 compute-0 podman[216690]: 2025-12-10 10:26:11.479638877 +0000 UTC m=+0.053081739 container remove 5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.489 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f8570ea7-df90-4d02-b3d6-a5a9a425f4f7]: (4, ('Wed Dec 10 10:26:11 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8 (5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf)\n5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf\nWed Dec 10 10:26:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8 (5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf)\n5ad1bca34551d348ef3d9633a17ab3749aaf902d46ce12c5084a6dd6e2471ebf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.491 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[04ecebf4-0501-4e4d-9e8a-3cc1a5969ea8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.492 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16c8959b-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.493 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:11 compute-0 kernel: tap16c8959b-00: left promiscuous mode
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.511 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.513 186993 INFO nova.compute.manager [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Took 0.34 seconds to destroy the instance on the hypervisor.
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.513 186993 DEBUG oslo.service.loopingcall [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.513 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[e30b37ef-5c31-457f-a86c-cf8021436bc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.514 186993 DEBUG nova.compute.manager [-] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.514 186993 DEBUG nova.network.neutron [-] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.528 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[b6f9ecee-841d-455c-ac4e-4da11a52d6dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.529 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[d2049185-294e-4421-8c8c-c9e28321957f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.543 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a53f03-70ce-4df3-b63f-5e2d0a16126e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 329993, 'reachable_time': 25039, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216711, 'error': None, 'target': 'ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.545 104414 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-16c8959b-0f9c-462b-981f-7320145346f8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 10 10:26:11 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:11.545 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[41e886e2-5856-4cf7-ab64-4d2f1acda1c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:11 compute-0 systemd[1]: run-netns-ovnmeta\x2d16c8959b\x2d0f9c\x2d462b\x2d981f\x2d7320145346f8.mount: Deactivated successfully.
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.709 186993 DEBUG nova.compute.manager [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-vif-plugged-56fa819b-df3d-49ba-a5c9-698cc74fb8aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.710 186993 DEBUG oslo_concurrency.lockutils [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.710 186993 DEBUG oslo_concurrency.lockutils [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.710 186993 DEBUG oslo_concurrency.lockutils [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.710 186993 DEBUG nova.compute.manager [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] No waiting events found dispatching network-vif-plugged-56fa819b-df3d-49ba-a5c9-698cc74fb8aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.710 186993 WARNING nova.compute.manager [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received unexpected event network-vif-plugged-56fa819b-df3d-49ba-a5c9-698cc74fb8aa for instance with vm_state active and task_state deleting.
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.710 186993 DEBUG nova.compute.manager [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-changed-507bf448-94f2-4c23-86a4-a13b31717ff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.711 186993 DEBUG nova.compute.manager [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Refreshing instance network info cache due to event network-changed-507bf448-94f2-4c23-86a4-a13b31717ff8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.711 186993 DEBUG oslo_concurrency.lockutils [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.711 186993 DEBUG oslo_concurrency.lockutils [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:26:11 compute-0 nova_compute[186989]: 2025-12-10 10:26:11.711 186993 DEBUG nova.network.neutron [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Refreshing network info cache for port 507bf448-94f2-4c23-86a4-a13b31717ff8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:26:12 compute-0 nova_compute[186989]: 2025-12-10 10:26:12.486 186993 DEBUG nova.network.neutron [-] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:26:12 compute-0 nova_compute[186989]: 2025-12-10 10:26:12.503 186993 INFO nova.compute.manager [-] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Took 0.99 seconds to deallocate network for instance.
Dec 10 10:26:12 compute-0 nova_compute[186989]: 2025-12-10 10:26:12.540 186993 DEBUG nova.compute.manager [req-50021b3f-2061-403a-9627-dc151834a88b req-27f0fbf1-d160-45a0-8312-12aa31ff6e23 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-vif-deleted-507bf448-94f2-4c23-86a4-a13b31717ff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:12 compute-0 nova_compute[186989]: 2025-12-10 10:26:12.623 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:12 compute-0 nova_compute[186989]: 2025-12-10 10:26:12.698 186993 DEBUG oslo_concurrency.lockutils [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:12 compute-0 nova_compute[186989]: 2025-12-10 10:26:12.699 186993 DEBUG oslo_concurrency.lockutils [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:12 compute-0 nova_compute[186989]: 2025-12-10 10:26:12.770 186993 DEBUG nova.compute.provider_tree [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:26:12 compute-0 nova_compute[186989]: 2025-12-10 10:26:12.787 186993 DEBUG nova.scheduler.client.report [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:26:12 compute-0 nova_compute[186989]: 2025-12-10 10:26:12.815 186993 DEBUG oslo_concurrency.lockutils [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:12 compute-0 nova_compute[186989]: 2025-12-10 10:26:12.850 186993 INFO nova.scheduler.client.report [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance 77bc78a9-08a2-448f-b9c0-cfd055940b6b
Dec 10 10:26:12 compute-0 nova_compute[186989]: 2025-12-10 10:26:12.915 186993 DEBUG oslo_concurrency.lockutils [None req-1e3a54f7-9f77-4090-9b9d-3992f860e340 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.616 186993 DEBUG nova.network.neutron [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updated VIF entry in instance network info cache for port 507bf448-94f2-4c23-86a4-a13b31717ff8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.617 186993 DEBUG nova.network.neutron [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Updating instance_info_cache with network_info: [{"id": "507bf448-94f2-4c23-86a4-a13b31717ff8", "address": "fa:16:3e:89:74:34", "network": {"id": "16c8959b-0f9c-462b-981f-7320145346f8", "bridge": "br-int", "label": "tempest-network-smoke--169205004", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap507bf448-94", "ovs_interfaceid": "507bf448-94f2-4c23-86a4-a13b31717ff8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.633 186993 DEBUG oslo_concurrency.lockutils [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-77bc78a9-08a2-448f-b9c0-cfd055940b6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.633 186993 DEBUG nova.compute.manager [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-vif-unplugged-507bf448-94f2-4c23-86a4-a13b31717ff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.633 186993 DEBUG oslo_concurrency.lockutils [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.633 186993 DEBUG oslo_concurrency.lockutils [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.634 186993 DEBUG oslo_concurrency.lockutils [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.634 186993 DEBUG nova.compute.manager [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] No waiting events found dispatching network-vif-unplugged-507bf448-94f2-4c23-86a4-a13b31717ff8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.634 186993 DEBUG nova.compute.manager [req-cbc418a9-dc51-41d9-8613-becf3ead459b req-abce2eba-6b3b-4cfe-ba3a-1bb7451c5135 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-vif-unplugged-507bf448-94f2-4c23-86a4-a13b31717ff8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.815 186993 DEBUG nova.compute.manager [req-c3adf636-48d1-440c-a26a-e4076f8ec502 req-fc246e21-0d8e-4e8e-a425-a95c18780451 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received event network-vif-plugged-507bf448-94f2-4c23-86a4-a13b31717ff8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.815 186993 DEBUG oslo_concurrency.lockutils [req-c3adf636-48d1-440c-a26a-e4076f8ec502 req-fc246e21-0d8e-4e8e-a425-a95c18780451 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.816 186993 DEBUG oslo_concurrency.lockutils [req-c3adf636-48d1-440c-a26a-e4076f8ec502 req-fc246e21-0d8e-4e8e-a425-a95c18780451 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.816 186993 DEBUG oslo_concurrency.lockutils [req-c3adf636-48d1-440c-a26a-e4076f8ec502 req-fc246e21-0d8e-4e8e-a425-a95c18780451 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "77bc78a9-08a2-448f-b9c0-cfd055940b6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.816 186993 DEBUG nova.compute.manager [req-c3adf636-48d1-440c-a26a-e4076f8ec502 req-fc246e21-0d8e-4e8e-a425-a95c18780451 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] No waiting events found dispatching network-vif-plugged-507bf448-94f2-4c23-86a4-a13b31717ff8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:26:13 compute-0 nova_compute[186989]: 2025-12-10 10:26:13.816 186993 WARNING nova.compute.manager [req-c3adf636-48d1-440c-a26a-e4076f8ec502 req-fc246e21-0d8e-4e8e-a425-a95c18780451 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Received unexpected event network-vif-plugged-507bf448-94f2-4c23-86a4-a13b31717ff8 for instance with vm_state deleted and task_state None.
Dec 10 10:26:16 compute-0 nova_compute[186989]: 2025-12-10 10:26:16.451 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:17 compute-0 nova_compute[186989]: 2025-12-10 10:26:17.003 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:17 compute-0 nova_compute[186989]: 2025-12-10 10:26:17.106 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:17 compute-0 nova_compute[186989]: 2025-12-10 10:26:17.625 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:18 compute-0 nova_compute[186989]: 2025-12-10 10:26:18.760 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362363.7585995, f2349666-5326-4e13-bd6a-8d6adb3613ad => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:26:18 compute-0 nova_compute[186989]: 2025-12-10 10:26:18.761 186993 INFO nova.compute.manager [-] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] VM Stopped (Lifecycle Event)
Dec 10 10:26:18 compute-0 nova_compute[186989]: 2025-12-10 10:26:18.781 186993 DEBUG nova.compute.manager [None req-49b0e1b6-e1d1-44c8-9271-2e19e9119b80 - - - - - -] [instance: f2349666-5326-4e13-bd6a-8d6adb3613ad] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:26:21 compute-0 nova_compute[186989]: 2025-12-10 10:26:21.499 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:22 compute-0 podman[216713]: 2025-12-10 10:26:22.041876933 +0000 UTC m=+0.080991421 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:26:22 compute-0 nova_compute[186989]: 2025-12-10 10:26:22.627 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:24 compute-0 podman[216738]: 2025-12-10 10:26:24.042409821 +0000 UTC m=+0.078458511 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 10 10:26:26 compute-0 nova_compute[186989]: 2025-12-10 10:26:26.430 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362371.428193, 77bc78a9-08a2-448f-b9c0-cfd055940b6b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:26:26 compute-0 nova_compute[186989]: 2025-12-10 10:26:26.431 186993 INFO nova.compute.manager [-] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] VM Stopped (Lifecycle Event)
Dec 10 10:26:26 compute-0 nova_compute[186989]: 2025-12-10 10:26:26.459 186993 DEBUG nova.compute.manager [None req-c64211a9-9a4f-4736-a7ac-a14a6db2c760 - - - - - -] [instance: 77bc78a9-08a2-448f-b9c0-cfd055940b6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:26:26 compute-0 nova_compute[186989]: 2025-12-10 10:26:26.503 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:27 compute-0 nova_compute[186989]: 2025-12-10 10:26:27.631 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:30 compute-0 podman[216758]: 2025-12-10 10:26:30.030870895 +0000 UTC m=+0.068227283 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 10 10:26:30 compute-0 podman[216759]: 2025-12-10 10:26:30.038493833 +0000 UTC m=+0.070579467 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 10 10:26:30 compute-0 podman[216760]: 2025-12-10 10:26:30.094201314 +0000 UTC m=+0.125318071 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:26:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:31.469 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:31.470 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:31.470 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:31 compute-0 nova_compute[186989]: 2025-12-10 10:26:31.506 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:32 compute-0 nova_compute[186989]: 2025-12-10 10:26:32.634 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.162 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.163 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.178 186993 DEBUG nova.compute.manager [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.258 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.258 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.269 186993 DEBUG nova.virt.hardware [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.269 186993 INFO nova.compute.claims [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.369 186993 DEBUG nova.compute.provider_tree [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.388 186993 DEBUG nova.scheduler.client.report [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.436 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.437 186993 DEBUG nova.compute.manager [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.502 186993 DEBUG nova.compute.manager [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.503 186993 DEBUG nova.network.neutron [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.525 186993 INFO nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.655 186993 DEBUG nova.compute.manager [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.755 186993 DEBUG nova.compute.manager [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.758 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.758 186993 INFO nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Creating image(s)
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.760 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.760 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.762 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.786 186993 DEBUG oslo_concurrency.processutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.855 186993 DEBUG oslo_concurrency.processutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.856 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.857 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.883 186993 DEBUG oslo_concurrency.processutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.948 186993 DEBUG oslo_concurrency.processutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.949 186993 DEBUG oslo_concurrency.processutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.990 186993 DEBUG oslo_concurrency.processutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.992 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:35 compute-0 nova_compute[186989]: 2025-12-10 10:26:35.992 186993 DEBUG oslo_concurrency.processutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.052 186993 DEBUG oslo_concurrency.processutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.054 186993 DEBUG nova.virt.disk.api [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.055 186993 DEBUG oslo_concurrency.processutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.126 186993 DEBUG oslo_concurrency.processutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.127 186993 DEBUG nova.virt.disk.api [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.128 186993 DEBUG nova.objects.instance [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid e101f1cf-8cb1-4383-b245-9f89d838a2e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.145 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.146 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Ensure instance console log exists: /var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.147 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.148 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.148 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.511 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:36 compute-0 nova_compute[186989]: 2025-12-10 10:26:36.605 186993 DEBUG nova.policy [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:26:37 compute-0 nova_compute[186989]: 2025-12-10 10:26:37.639 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:38 compute-0 nova_compute[186989]: 2025-12-10 10:26:38.701 186993 DEBUG nova.network.neutron [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Successfully updated port: 23c0526d-3a07-4045-b515-64f7987d3bec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:26:38 compute-0 nova_compute[186989]: 2025-12-10 10:26:38.718 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-e101f1cf-8cb1-4383-b245-9f89d838a2e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:26:38 compute-0 nova_compute[186989]: 2025-12-10 10:26:38.718 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-e101f1cf-8cb1-4383-b245-9f89d838a2e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:26:38 compute-0 nova_compute[186989]: 2025-12-10 10:26:38.719 186993 DEBUG nova.network.neutron [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:26:38 compute-0 nova_compute[186989]: 2025-12-10 10:26:38.795 186993 DEBUG nova.compute.manager [req-fbc2c9b6-1ece-43a8-8db2-3ea291d2dd68 req-bd0ebc37-afbd-4a60-bdb2-b4ac0fa8591a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Received event network-changed-23c0526d-3a07-4045-b515-64f7987d3bec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:38 compute-0 nova_compute[186989]: 2025-12-10 10:26:38.795 186993 DEBUG nova.compute.manager [req-fbc2c9b6-1ece-43a8-8db2-3ea291d2dd68 req-bd0ebc37-afbd-4a60-bdb2-b4ac0fa8591a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Refreshing instance network info cache due to event network-changed-23c0526d-3a07-4045-b515-64f7987d3bec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:26:38 compute-0 nova_compute[186989]: 2025-12-10 10:26:38.795 186993 DEBUG oslo_concurrency.lockutils [req-fbc2c9b6-1ece-43a8-8db2-3ea291d2dd68 req-bd0ebc37-afbd-4a60-bdb2-b4ac0fa8591a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-e101f1cf-8cb1-4383-b245-9f89d838a2e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:26:38 compute-0 nova_compute[186989]: 2025-12-10 10:26:38.865 186993 DEBUG nova.network.neutron [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:26:39 compute-0 podman[216837]: 2025-12-10 10:26:39.036152419 +0000 UTC m=+0.079134871 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.490 186993 DEBUG nova.network.neutron [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Updating instance_info_cache with network_info: [{"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.516 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-e101f1cf-8cb1-4383-b245-9f89d838a2e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.517 186993 DEBUG nova.compute.manager [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Instance network_info: |[{"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.518 186993 DEBUG oslo_concurrency.lockutils [req-fbc2c9b6-1ece-43a8-8db2-3ea291d2dd68 req-bd0ebc37-afbd-4a60-bdb2-b4ac0fa8591a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-e101f1cf-8cb1-4383-b245-9f89d838a2e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.519 186993 DEBUG nova.network.neutron [req-fbc2c9b6-1ece-43a8-8db2-3ea291d2dd68 req-bd0ebc37-afbd-4a60-bdb2-b4ac0fa8591a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Refreshing network info cache for port 23c0526d-3a07-4045-b515-64f7987d3bec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.524 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Start _get_guest_xml network_info=[{"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.531 186993 WARNING nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.537 186993 DEBUG nova.virt.libvirt.host [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.538 186993 DEBUG nova.virt.libvirt.host [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.548 186993 DEBUG nova.virt.libvirt.host [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.549 186993 DEBUG nova.virt.libvirt.host [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.550 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.550 186993 DEBUG nova.virt.hardware [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.551 186993 DEBUG nova.virt.hardware [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.551 186993 DEBUG nova.virt.hardware [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.552 186993 DEBUG nova.virt.hardware [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.552 186993 DEBUG nova.virt.hardware [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.553 186993 DEBUG nova.virt.hardware [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.553 186993 DEBUG nova.virt.hardware [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.554 186993 DEBUG nova.virt.hardware [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.554 186993 DEBUG nova.virt.hardware [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.555 186993 DEBUG nova.virt.hardware [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.555 186993 DEBUG nova.virt.hardware [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.562 186993 DEBUG nova.virt.libvirt.vif [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:26:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1905495063',display_name='tempest-TestNetworkBasicOps-server-1905495063',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1905495063',id=8,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIdI7taRm6qPCce+sluv977emQQ4z7gCph/DR0pvg04/aHJwyMIEVoRnz5wLU0SRwxdOqJC5ddydVdW88lYaY1o+leeVFXjXF/wotGBrph04WmrDOtEw8bry3w2Y67XcbA==',key_name='tempest-TestNetworkBasicOps-426589612',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-nseso6md',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:26:35Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=e101f1cf-8cb1-4383-b245-9f89d838a2e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.562 186993 DEBUG nova.network.os_vif_util [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.563 186993 DEBUG nova.network.os_vif_util [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.565 186993 DEBUG nova.objects.instance [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid e101f1cf-8cb1-4383-b245-9f89d838a2e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.583 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:26:40 compute-0 nova_compute[186989]:   <uuid>e101f1cf-8cb1-4383-b245-9f89d838a2e5</uuid>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   <name>instance-00000008</name>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-1905495063</nova:name>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:26:40</nova:creationTime>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:26:40 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:26:40 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:26:40 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:26:40 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:26:40 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:26:40 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:26:40 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:26:40 compute-0 nova_compute[186989]:         <nova:port uuid="23c0526d-3a07-4045-b515-64f7987d3bec">
Dec 10 10:26:40 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <system>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <entry name="serial">e101f1cf-8cb1-4383-b245-9f89d838a2e5</entry>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <entry name="uuid">e101f1cf-8cb1-4383-b245-9f89d838a2e5</entry>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     </system>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   <os>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   </os>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   <features>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   </features>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk.config"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:7b:60:01"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <target dev="tap23c0526d-3a"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/console.log" append="off"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <video>
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     </video>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:26:40 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:26:40 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:26:40 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:26:40 compute-0 nova_compute[186989]: </domain>
Dec 10 10:26:40 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.586 186993 DEBUG nova.compute.manager [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Preparing to wait for external event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.586 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.587 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.587 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.588 186993 DEBUG nova.virt.libvirt.vif [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:26:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1905495063',display_name='tempest-TestNetworkBasicOps-server-1905495063',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1905495063',id=8,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIdI7taRm6qPCce+sluv977emQQ4z7gCph/DR0pvg04/aHJwyMIEVoRnz5wLU0SRwxdOqJC5ddydVdW88lYaY1o+leeVFXjXF/wotGBrph04WmrDOtEw8bry3w2Y67XcbA==',key_name='tempest-TestNetworkBasicOps-426589612',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-nseso6md',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:26:35Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=e101f1cf-8cb1-4383-b245-9f89d838a2e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.589 186993 DEBUG nova.network.os_vif_util [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.589 186993 DEBUG nova.network.os_vif_util [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.590 186993 DEBUG os_vif [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.590 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.591 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.591 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.594 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.595 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23c0526d-3a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.595 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap23c0526d-3a, col_values=(('external_ids', {'iface-id': '23c0526d-3a07-4045-b515-64f7987d3bec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:60:01', 'vm-uuid': 'e101f1cf-8cb1-4383-b245-9f89d838a2e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.599 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:26:40 compute-0 NetworkManager[55541]: <info>  [1765362400.6001] manager: (tap23c0526d-3a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.605 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.606 186993 INFO os_vif [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a')
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.682 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.682 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.682 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:7b:60:01, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:26:40 compute-0 nova_compute[186989]: 2025-12-10 10:26:40.683 186993 INFO nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Using config drive
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.473 186993 INFO nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Creating config drive at /var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk.config
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.484 186993 DEBUG oslo_concurrency.processutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpux8sslmz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.623 186993 DEBUG oslo_concurrency.processutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpux8sslmz" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:26:41 compute-0 kernel: tap23c0526d-3a: entered promiscuous mode
Dec 10 10:26:41 compute-0 NetworkManager[55541]: <info>  [1765362401.7134] manager: (tap23c0526d-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Dec 10 10:26:41 compute-0 ovn_controller[95452]: 2025-12-10T10:26:41Z|00111|binding|INFO|Claiming lport 23c0526d-3a07-4045-b515-64f7987d3bec for this chassis.
Dec 10 10:26:41 compute-0 ovn_controller[95452]: 2025-12-10T10:26:41Z|00112|binding|INFO|23c0526d-3a07-4045-b515-64f7987d3bec: Claiming fa:16:3e:7b:60:01 10.100.0.13
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.715 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.719 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.728 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.734 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:60:01 10.100.0.13'], port_security=['fa:16:3e:7b:60:01 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-653781637', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e101f1cf-8cb1-4383-b245-9f89d838a2e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-653781637', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': '796e6156-6d8e-4cf4-b04a-830fa4553503', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=302e70e8-d520-4dc6-aad3-73a28bc17154, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=23c0526d-3a07-4045-b515-64f7987d3bec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.735 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 23c0526d-3a07-4045-b515-64f7987d3bec in datapath f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 bound to our chassis
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.736 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6
Dec 10 10:26:41 compute-0 systemd-udevd[216891]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.754 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc0d608-7335-499b-ab2b-7b1e3a95172e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.755 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf6c2a93d-71 in ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.757 213247 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf6c2a93d-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.757 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f510e255-66dd-4f57-869a-ec29ea4a61bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.758 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[85e52ba2-c537-4a69-90cc-9c04c52f95f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 systemd-machined[153379]: New machine qemu-8-instance-00000008.
Dec 10 10:26:41 compute-0 NetworkManager[55541]: <info>  [1765362401.7717] device (tap23c0526d-3a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:26:41 compute-0 NetworkManager[55541]: <info>  [1765362401.7729] device (tap23c0526d-3a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.772 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[b6027fa9-d2a8-4097-8646-e07891b8a4ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.780 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:41 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Dec 10 10:26:41 compute-0 ovn_controller[95452]: 2025-12-10T10:26:41Z|00113|binding|INFO|Setting lport 23c0526d-3a07-4045-b515-64f7987d3bec ovn-installed in OVS
Dec 10 10:26:41 compute-0 ovn_controller[95452]: 2025-12-10T10:26:41Z|00114|binding|INFO|Setting lport 23c0526d-3a07-4045-b515-64f7987d3bec up in Southbound
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.786 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.794 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[09656310-b902-4204-9335-32f6992dbc47]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 podman[216871]: 2025-12-10 10:26:41.799119497 +0000 UTC m=+0.095182608 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.829 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[1095f3ba-a74a-4e76-896d-0e4e83390df2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 NetworkManager[55541]: <info>  [1765362401.8394] manager: (tapf6c2a93d-70): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.839 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f362cc20-2c2c-4a12-a311-a5f9092eaf73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.873 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[fe141854-1adc-40f6-bc04-cdd1638ef74a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.878 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ecd80a-611c-44db-a40b-b6222ee6f8e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 NetworkManager[55541]: <info>  [1765362401.9040] device (tapf6c2a93d-70): carrier: link connected
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.911 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[cb9bfb08-1b3c-4d20-817a-7536baa09a0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.929 186993 DEBUG nova.network.neutron [req-fbc2c9b6-1ece-43a8-8db2-3ea291d2dd68 req-bd0ebc37-afbd-4a60-bdb2-b4ac0fa8591a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Updated VIF entry in instance network info cache for port 23c0526d-3a07-4045-b515-64f7987d3bec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.930 186993 DEBUG nova.network.neutron [req-fbc2c9b6-1ece-43a8-8db2-3ea291d2dd68 req-bd0ebc37-afbd-4a60-bdb2-b4ac0fa8591a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Updating instance_info_cache with network_info: [{"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.930 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e0f67f-c0d9-41e5-bac1-b0b84dabb1d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6c2a93d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:76:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341712, 'reachable_time': 28003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216935, 'error': None, 'target': 'ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.948 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[9816be82-7970-4428-87d1-74768c15dacb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:76e3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341712, 'tstamp': 341712}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216938, 'error': None, 'target': 'ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.952 186993 DEBUG oslo_concurrency.lockutils [req-fbc2c9b6-1ece-43a8-8db2-3ea291d2dd68 req-bd0ebc37-afbd-4a60-bdb2-b4ac0fa8591a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-e101f1cf-8cb1-4383-b245-9f89d838a2e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:26:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:41.967 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[16bfcc43-8eea-4caf-a0b2-b545ef10da58]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6c2a93d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:76:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 266, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 266, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341712, 'reachable_time': 28003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216943, 'error': None, 'target': 'ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.985 186993 DEBUG nova.compute.manager [req-2dd53ebf-7b98-4be2-9df8-1547963f37a3 req-0e6eda23-ec39-4d8a-bf07-dbd756207b7b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Received event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.986 186993 DEBUG oslo_concurrency.lockutils [req-2dd53ebf-7b98-4be2-9df8-1547963f37a3 req-0e6eda23-ec39-4d8a-bf07-dbd756207b7b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.987 186993 DEBUG oslo_concurrency.lockutils [req-2dd53ebf-7b98-4be2-9df8-1547963f37a3 req-0e6eda23-ec39-4d8a-bf07-dbd756207b7b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.987 186993 DEBUG oslo_concurrency.lockutils [req-2dd53ebf-7b98-4be2-9df8-1547963f37a3 req-0e6eda23-ec39-4d8a-bf07-dbd756207b7b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:41 compute-0 nova_compute[186989]: 2025-12-10 10:26:41.987 186993 DEBUG nova.compute.manager [req-2dd53ebf-7b98-4be2-9df8-1547963f37a3 req-0e6eda23-ec39-4d8a-bf07-dbd756207b7b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Processing event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:42.008 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[8aca2585-7708-4401-9fd9-eb76c3b1a2df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.045 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362402.0445027, e101f1cf-8cb1-4383-b245-9f89d838a2e5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.046 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] VM Started (Lifecycle Event)
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.048 186993 DEBUG nova.compute.manager [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.052 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.056 186993 INFO nova.virt.libvirt.driver [-] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Instance spawned successfully.
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.056 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.069 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.075 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.080 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.081 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.081 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.082 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.082 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.083 186993 DEBUG nova.virt.libvirt.driver [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:42.097 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[0856d678-6124-46c6-bd7c-3f4dd398e82c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:42.099 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6c2a93d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:42.100 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:42.100 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6c2a93d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.102 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:42 compute-0 NetworkManager[55541]: <info>  [1765362402.1031] manager: (tapf6c2a93d-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Dec 10 10:26:42 compute-0 kernel: tapf6c2a93d-70: entered promiscuous mode
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.105 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.105 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362402.0446286, e101f1cf-8cb1-4383-b245-9f89d838a2e5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.106 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] VM Paused (Lifecycle Event)
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.107 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:42.109 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf6c2a93d-70, col_values=(('external_ids', {'iface-id': 'd517836d-4035-4dbb-836e-b4df0be637a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.110 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:42 compute-0 ovn_controller[95452]: 2025-12-10T10:26:42Z|00115|binding|INFO|Releasing lport d517836d-4035-4dbb-836e-b4df0be637a5 from this chassis (sb_readonly=0)
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.111 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:42.114 104302 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:42.115 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[8de6eff8-148a-428c-9ae8-ed7152a01ccc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:42.116 104302 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: global
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     log         /dev/log local0 debug
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     log-tag     haproxy-metadata-proxy-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     user        root
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     group       root
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     maxconn     1024
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     pidfile     /var/lib/neutron/external/pids/f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6.pid.haproxy
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     daemon
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: defaults
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     log global
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     mode http
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     option httplog
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     option dontlognull
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     option http-server-close
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     option forwardfor
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     retries                 3
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     timeout http-request    30s
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     timeout connect         30s
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     timeout client          32s
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     timeout server          32s
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     timeout http-keep-alive 30s
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: listen listener
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     bind 169.254.169.254:80
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     server metadata /var/lib/neutron/metadata_proxy
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:     http-request add-header X-OVN-Network-ID f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:42.117 104302 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'env', 'PROCESS_TAG=haproxy-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.127 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.145 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.149 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362402.0512516, e101f1cf-8cb1-4383-b245-9f89d838a2e5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.150 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] VM Resumed (Lifecycle Event)
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.156 186993 INFO nova.compute.manager [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Took 6.40 seconds to spawn the instance on the hypervisor.
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.156 186993 DEBUG nova.compute.manager [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.167 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.170 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.189 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.218 186993 INFO nova.compute.manager [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Took 7.00 seconds to build instance.
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.235 186993 DEBUG oslo_concurrency.lockutils [None req-dd7d4668-81d3-4aff-be2d-9d8d80d03daf 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:42.544 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:d5:69', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:b1:dd:ed:fa:0b'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:26:42 compute-0 podman[216976]: 2025-12-10 10:26:42.565457833 +0000 UTC m=+0.064499980 container create 9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.572 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:42 compute-0 systemd[1]: Started libpod-conmon-9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936.scope.
Dec 10 10:26:42 compute-0 podman[216976]: 2025-12-10 10:26:42.527410465 +0000 UTC m=+0.026452622 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:26:42 compute-0 nova_compute[186989]: 2025-12-10 10:26:42.641 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:42 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:26:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0493041bab572087cdf07c30781ef5076ce4a061ba90f2bd274e808412d2024/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:26:42 compute-0 podman[216976]: 2025-12-10 10:26:42.680675527 +0000 UTC m=+0.179717654 container init 9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 10 10:26:42 compute-0 podman[216976]: 2025-12-10 10:26:42.687072971 +0000 UTC m=+0.186115088 container start 9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 10 10:26:42 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[216991]: [NOTICE]   (216995) : New worker (216997) forked
Dec 10 10:26:42 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[216991]: [NOTICE]   (216995) : Loading success.
Dec 10 10:26:42 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:42.764 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 10 10:26:44 compute-0 nova_compute[186989]: 2025-12-10 10:26:44.064 186993 DEBUG nova.compute.manager [req-e9598fa7-c050-44f5-9414-7eb3a24dd401 req-cafd6884-e401-4fd9-b6e4-c7f9dd531793 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Received event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:44 compute-0 nova_compute[186989]: 2025-12-10 10:26:44.066 186993 DEBUG oslo_concurrency.lockutils [req-e9598fa7-c050-44f5-9414-7eb3a24dd401 req-cafd6884-e401-4fd9-b6e4-c7f9dd531793 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:44 compute-0 nova_compute[186989]: 2025-12-10 10:26:44.066 186993 DEBUG oslo_concurrency.lockutils [req-e9598fa7-c050-44f5-9414-7eb3a24dd401 req-cafd6884-e401-4fd9-b6e4-c7f9dd531793 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:44 compute-0 nova_compute[186989]: 2025-12-10 10:26:44.066 186993 DEBUG oslo_concurrency.lockutils [req-e9598fa7-c050-44f5-9414-7eb3a24dd401 req-cafd6884-e401-4fd9-b6e4-c7f9dd531793 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:44 compute-0 nova_compute[186989]: 2025-12-10 10:26:44.067 186993 DEBUG nova.compute.manager [req-e9598fa7-c050-44f5-9414-7eb3a24dd401 req-cafd6884-e401-4fd9-b6e4-c7f9dd531793 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] No waiting events found dispatching network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:26:44 compute-0 nova_compute[186989]: 2025-12-10 10:26:44.067 186993 WARNING nova.compute.manager [req-e9598fa7-c050-44f5-9414-7eb3a24dd401 req-cafd6884-e401-4fd9-b6e4-c7f9dd531793 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Received unexpected event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec for instance with vm_state active and task_state None.
Dec 10 10:26:44 compute-0 ovn_controller[95452]: 2025-12-10T10:26:44Z|00116|binding|INFO|Releasing lport d517836d-4035-4dbb-836e-b4df0be637a5 from this chassis (sb_readonly=0)
Dec 10 10:26:44 compute-0 NetworkManager[55541]: <info>  [1765362404.7601] manager: (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Dec 10 10:26:44 compute-0 nova_compute[186989]: 2025-12-10 10:26:44.760 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:44 compute-0 NetworkManager[55541]: <info>  [1765362404.7614] manager: (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Dec 10 10:26:44 compute-0 nova_compute[186989]: 2025-12-10 10:26:44.845 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:44 compute-0 ovn_controller[95452]: 2025-12-10T10:26:44Z|00117|binding|INFO|Releasing lport d517836d-4035-4dbb-836e-b4df0be637a5 from this chassis (sb_readonly=0)
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.170 186993 DEBUG nova.compute.manager [req-6f11d342-3519-474f-937b-3ada36199f57 req-de3670c7-c45d-4a72-a138-2407a6fd85bb 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Received event network-changed-23c0526d-3a07-4045-b515-64f7987d3bec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.172 186993 DEBUG nova.compute.manager [req-6f11d342-3519-474f-937b-3ada36199f57 req-de3670c7-c45d-4a72-a138-2407a6fd85bb 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Refreshing instance network info cache due to event network-changed-23c0526d-3a07-4045-b515-64f7987d3bec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.173 186993 DEBUG oslo_concurrency.lockutils [req-6f11d342-3519-474f-937b-3ada36199f57 req-de3670c7-c45d-4a72-a138-2407a6fd85bb 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-e101f1cf-8cb1-4383-b245-9f89d838a2e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.173 186993 DEBUG oslo_concurrency.lockutils [req-6f11d342-3519-474f-937b-3ada36199f57 req-de3670c7-c45d-4a72-a138-2407a6fd85bb 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-e101f1cf-8cb1-4383-b245-9f89d838a2e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.174 186993 DEBUG nova.network.neutron [req-6f11d342-3519-474f-937b-3ada36199f57 req-de3670c7-c45d-4a72-a138-2407a6fd85bb 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Refreshing network info cache for port 23c0526d-3a07-4045-b515-64f7987d3bec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.330 186993 DEBUG oslo_concurrency.lockutils [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.331 186993 DEBUG oslo_concurrency.lockutils [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.332 186993 DEBUG oslo_concurrency.lockutils [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.332 186993 DEBUG oslo_concurrency.lockutils [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.333 186993 DEBUG oslo_concurrency.lockutils [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.335 186993 INFO nova.compute.manager [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Terminating instance
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.337 186993 DEBUG nova.compute.manager [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:26:45 compute-0 kernel: tap23c0526d-3a (unregistering): left promiscuous mode
Dec 10 10:26:45 compute-0 NetworkManager[55541]: <info>  [1765362405.3658] device (tap23c0526d-3a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:26:45 compute-0 ovn_controller[95452]: 2025-12-10T10:26:45Z|00118|binding|INFO|Releasing lport 23c0526d-3a07-4045-b515-64f7987d3bec from this chassis (sb_readonly=0)
Dec 10 10:26:45 compute-0 ovn_controller[95452]: 2025-12-10T10:26:45Z|00119|binding|INFO|Setting lport 23c0526d-3a07-4045-b515-64f7987d3bec down in Southbound
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.375 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:45 compute-0 ovn_controller[95452]: 2025-12-10T10:26:45Z|00120|binding|INFO|Removing iface tap23c0526d-3a ovn-installed in OVS
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.378 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.385 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:60:01 10.100.0.13'], port_security=['fa:16:3e:7b:60:01 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-653781637', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e101f1cf-8cb1-4383-b245-9f89d838a2e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-653781637', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '4', 'neutron:security_group_ids': '796e6156-6d8e-4cf4-b04a-830fa4553503', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=302e70e8-d520-4dc6-aad3-73a28bc17154, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=23c0526d-3a07-4045-b515-64f7987d3bec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.388 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 23c0526d-3a07-4045-b515-64f7987d3bec in datapath f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 unbound from our chassis
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.391 104302 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.393 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[8cee06a0-31c3-4c75-a092-fd6050cd1a7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.394 104302 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 namespace which is not needed anymore
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.417 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:45 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec 10 10:26:45 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 3.561s CPU time.
Dec 10 10:26:45 compute-0 systemd-machined[153379]: Machine qemu-8-instance-00000008 terminated.
Dec 10 10:26:45 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[216991]: [NOTICE]   (216995) : haproxy version is 2.8.14-c23fe91
Dec 10 10:26:45 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[216991]: [NOTICE]   (216995) : path to executable is /usr/sbin/haproxy
Dec 10 10:26:45 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[216991]: [WARNING]  (216995) : Exiting Master process...
Dec 10 10:26:45 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[216991]: [ALERT]    (216995) : Current worker (216997) exited with code 143 (Terminated)
Dec 10 10:26:45 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[216991]: [WARNING]  (216995) : All workers exited. Exiting... (0)
Dec 10 10:26:45 compute-0 systemd[1]: libpod-9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936.scope: Deactivated successfully.
Dec 10 10:26:45 compute-0 podman[217031]: 2025-12-10 10:26:45.554965653 +0000 UTC m=+0.054940560 container died 9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 10 10:26:45 compute-0 kernel: tap23c0526d-3a: entered promiscuous mode
Dec 10 10:26:45 compute-0 kernel: tap23c0526d-3a (unregistering): left promiscuous mode
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.575 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936-userdata-shm.mount: Deactivated successfully.
Dec 10 10:26:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-d0493041bab572087cdf07c30781ef5076ce4a061ba90f2bd274e808412d2024-merged.mount: Deactivated successfully.
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.603 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:45 compute-0 podman[217031]: 2025-12-10 10:26:45.617425237 +0000 UTC m=+0.117400124 container cleanup 9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.621 186993 INFO nova.virt.libvirt.driver [-] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Instance destroyed successfully.
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.622 186993 DEBUG nova.objects.instance [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid e101f1cf-8cb1-4383-b245-9f89d838a2e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.623 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e101f1cf-8cb1-4383-b245-9f89d838a2e5', 'name': 'tempest-TestNetworkBasicOps-server-1905495063', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000008', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '82da19f85bb840d2a70395c3d761ef38', 'user_id': '603f9c3a99e145e4a64248329321a249', 'hostId': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.625 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.627 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.628 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.629 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.629 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.630 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.631 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.631 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.631 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1905495063>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1905495063>]
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.632 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.633 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.633 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.634 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.634 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.636 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.636 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 10 10:26:45 compute-0 systemd[1]: libpod-conmon-9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936.scope: Deactivated successfully.
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.637 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.637 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.637 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.637 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1905495063>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1905495063>]
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.638 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.638 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.638 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1905495063>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1905495063>]
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.638 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.639 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.639 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.641 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.641 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.642 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.642 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.643 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.644 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.645 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.645 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.646 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.646 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.646 186993 DEBUG nova.virt.libvirt.vif [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:26:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1905495063',display_name='tempest-TestNetworkBasicOps-server-1905495063',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1905495063',id=8,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIdI7taRm6qPCce+sluv977emQQ4z7gCph/DR0pvg04/aHJwyMIEVoRnz5wLU0SRwxdOqJC5ddydVdW88lYaY1o+leeVFXjXF/wotGBrph04WmrDOtEw8bry3w2Y67XcbA==',key_name='tempest-TestNetworkBasicOps-426589612',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:26:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-nseso6md',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:26:42Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=e101f1cf-8cb1-4383-b245-9f89d838a2e5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.647 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.647 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.647 186993 DEBUG nova.network.os_vif_util [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.648 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.649 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.649 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.649 186993 DEBUG nova.network.os_vif_util [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.650 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.650 186993 DEBUG os_vif [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.651 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.651 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.652 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.652 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.652 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.652 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1905495063>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1905495063>]
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.653 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.653 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.654 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.654 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.655 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 10 10:26:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:26:45.656 12 DEBUG ceilometer.compute.pollsters [-] Instance e101f1cf-8cb1-4383-b245-9f89d838a2e5 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000008, id=e101f1cf-8cb1-4383-b245-9f89d838a2e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.656 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.657 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23c0526d-3a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.659 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.661 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.664 186993 INFO os_vif [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a')
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.665 186993 INFO nova.virt.libvirt.driver [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Deleting instance files /var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5_del
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.666 186993 INFO nova.virt.libvirt.driver [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Deletion of /var/lib/nova/instances/e101f1cf-8cb1-4383-b245-9f89d838a2e5_del complete
Dec 10 10:26:45 compute-0 podman[217078]: 2025-12-10 10:26:45.705860839 +0000 UTC m=+0.053254443 container remove 9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.713 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[c7cc8cf8-3d63-4b37-aa0e-0da9a78ff509]: (4, ('Wed Dec 10 10:26:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 (9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936)\n9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936\nWed Dec 10 10:26:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 (9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936)\n9b3ae0e43596edb77ba5ed9657da52e9f80153ef084c83672f5ba92dfdc64936\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.717 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[de68bf1e-2dc7-45a5-b40e-be5d2f9596e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.720 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6c2a93d-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:45 compute-0 kernel: tapf6c2a93d-70: left promiscuous mode
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.723 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.729 186993 INFO nova.compute.manager [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Took 0.39 seconds to destroy the instance on the hypervisor.
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.730 186993 DEBUG oslo.service.loopingcall [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.730 186993 DEBUG nova.compute.manager [-] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.730 186993 DEBUG nova.network.neutron [-] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:26:45 compute-0 nova_compute[186989]: 2025-12-10 10:26:45.738 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.740 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[c99532da-f14d-4d55-8d2c-1bc0bd7b6ebc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.757 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[647c8248-d04e-4d31-8aa5-c73a4480915d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.759 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f6427203-32cb-4136-b50e-03b6d645e592]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.765 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.784 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5c335b3b-f593-4af2-baa5-30e3d7a3443a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341704, 'reachable_time': 43429, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217093, 'error': None, 'target': 'ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.787 104414 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 10 10:26:45 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:45.787 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[7336894a-c56a-4e3e-a20e-f384701334cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:45 compute-0 systemd[1]: run-netns-ovnmeta\x2df6c2a93d\x2d7c6f\x2d44cf\x2da6a8\x2d2dc701b77fb6.mount: Deactivated successfully.
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.130 186993 DEBUG nova.compute.manager [req-119505ee-3ee1-4ce4-a152-21697df332dd req-fe703bae-3dee-4571-b9f7-379097051064 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Received event network-vif-unplugged-23c0526d-3a07-4045-b515-64f7987d3bec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.131 186993 DEBUG oslo_concurrency.lockutils [req-119505ee-3ee1-4ce4-a152-21697df332dd req-fe703bae-3dee-4571-b9f7-379097051064 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.132 186993 DEBUG oslo_concurrency.lockutils [req-119505ee-3ee1-4ce4-a152-21697df332dd req-fe703bae-3dee-4571-b9f7-379097051064 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.133 186993 DEBUG oslo_concurrency.lockutils [req-119505ee-3ee1-4ce4-a152-21697df332dd req-fe703bae-3dee-4571-b9f7-379097051064 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.133 186993 DEBUG nova.compute.manager [req-119505ee-3ee1-4ce4-a152-21697df332dd req-fe703bae-3dee-4571-b9f7-379097051064 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] No waiting events found dispatching network-vif-unplugged-23c0526d-3a07-4045-b515-64f7987d3bec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.133 186993 DEBUG nova.compute.manager [req-119505ee-3ee1-4ce4-a152-21697df332dd req-fe703bae-3dee-4571-b9f7-379097051064 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Received event network-vif-unplugged-23c0526d-3a07-4045-b515-64f7987d3bec for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.133 186993 DEBUG nova.compute.manager [req-119505ee-3ee1-4ce4-a152-21697df332dd req-fe703bae-3dee-4571-b9f7-379097051064 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Received event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.133 186993 DEBUG oslo_concurrency.lockutils [req-119505ee-3ee1-4ce4-a152-21697df332dd req-fe703bae-3dee-4571-b9f7-379097051064 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.134 186993 DEBUG oslo_concurrency.lockutils [req-119505ee-3ee1-4ce4-a152-21697df332dd req-fe703bae-3dee-4571-b9f7-379097051064 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.134 186993 DEBUG oslo_concurrency.lockutils [req-119505ee-3ee1-4ce4-a152-21697df332dd req-fe703bae-3dee-4571-b9f7-379097051064 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.134 186993 DEBUG nova.compute.manager [req-119505ee-3ee1-4ce4-a152-21697df332dd req-fe703bae-3dee-4571-b9f7-379097051064 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] No waiting events found dispatching network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.134 186993 WARNING nova.compute.manager [req-119505ee-3ee1-4ce4-a152-21697df332dd req-fe703bae-3dee-4571-b9f7-379097051064 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Received unexpected event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec for instance with vm_state active and task_state deleting.
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.320 186993 DEBUG nova.network.neutron [req-6f11d342-3519-474f-937b-3ada36199f57 req-de3670c7-c45d-4a72-a138-2407a6fd85bb 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Updated VIF entry in instance network info cache for port 23c0526d-3a07-4045-b515-64f7987d3bec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.321 186993 DEBUG nova.network.neutron [req-6f11d342-3519-474f-937b-3ada36199f57 req-de3670c7-c45d-4a72-a138-2407a6fd85bb 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Updating instance_info_cache with network_info: [{"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.338 186993 DEBUG oslo_concurrency.lockutils [req-6f11d342-3519-474f-937b-3ada36199f57 req-de3670c7-c45d-4a72-a138-2407a6fd85bb 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-e101f1cf-8cb1-4383-b245-9f89d838a2e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.849 186993 DEBUG nova.network.neutron [-] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.883 186993 INFO nova.compute.manager [-] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Took 1.15 seconds to deallocate network for instance.
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.927 186993 DEBUG oslo_concurrency.lockutils [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.927 186993 DEBUG oslo_concurrency.lockutils [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:46 compute-0 nova_compute[186989]: 2025-12-10 10:26:46.933 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:26:47 compute-0 nova_compute[186989]: 2025-12-10 10:26:47.004 186993 DEBUG nova.compute.provider_tree [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:26:47 compute-0 nova_compute[186989]: 2025-12-10 10:26:47.025 186993 DEBUG nova.scheduler.client.report [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:26:47 compute-0 nova_compute[186989]: 2025-12-10 10:26:47.049 186993 DEBUG oslo_concurrency.lockutils [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:47 compute-0 nova_compute[186989]: 2025-12-10 10:26:47.086 186993 INFO nova.scheduler.client.report [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance e101f1cf-8cb1-4383-b245-9f89d838a2e5
Dec 10 10:26:47 compute-0 nova_compute[186989]: 2025-12-10 10:26:47.157 186993 DEBUG oslo_concurrency.lockutils [None req-80484d50-9b91-4ad2-a6b6-07a8e6db122d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e101f1cf-8cb1-4383-b245-9f89d838a2e5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.826s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:47 compute-0 nova_compute[186989]: 2025-12-10 10:26:47.649 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:50 compute-0 nova_compute[186989]: 2025-12-10 10:26:50.660 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:50 compute-0 nova_compute[186989]: 2025-12-10 10:26:50.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:26:50 compute-0 nova_compute[186989]: 2025-12-10 10:26:50.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:26:51 compute-0 nova_compute[186989]: 2025-12-10 10:26:51.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:26:52 compute-0 nova_compute[186989]: 2025-12-10 10:26:52.652 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:52 compute-0 nova_compute[186989]: 2025-12-10 10:26:52.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:26:52 compute-0 nova_compute[186989]: 2025-12-10 10:26:52.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:26:53 compute-0 podman[217095]: 2025-12-10 10:26:53.076857582 +0000 UTC m=+0.105630483 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:26:53 compute-0 nova_compute[186989]: 2025-12-10 10:26:53.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.872 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.873 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.890 186993 DEBUG nova.compute.manager [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.965 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.965 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.987 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.988 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.988 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.988 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.990 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:54 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.991 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:54.999 186993 DEBUG nova.virt.hardware [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.000 186993 INFO nova.compute.claims [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:26:55 compute-0 podman[217121]: 2025-12-10 10:26:55.034452468 +0000 UTC m=+0.082047769 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.109 186993 DEBUG nova.compute.provider_tree [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.129 186993 DEBUG nova.scheduler.client.report [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.152 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.153 186993 DEBUG nova.compute.manager [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.201 186993 DEBUG nova.compute.manager [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.202 186993 DEBUG nova.network.neutron [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.232 186993 INFO nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.237 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.238 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5705MB free_disk=73.32991027832031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.238 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.239 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.254 186993 DEBUG nova.compute.manager [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.308 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Instance e71a25fc-84a8-47f2-9cd4-00f608ed48a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.308 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.308 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.337 186993 DEBUG nova.compute.manager [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.339 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.339 186993 INFO nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Creating image(s)
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.340 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.340 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.341 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.354 186993 DEBUG oslo_concurrency.processutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.387 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.402 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.425 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.426 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.435 186993 DEBUG oslo_concurrency.processutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.435 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.436 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.449 186993 DEBUG oslo_concurrency.processutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.510 186993 DEBUG oslo_concurrency.processutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.511 186993 DEBUG oslo_concurrency.processutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.551 186993 DEBUG oslo_concurrency.processutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.552 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.553 186993 DEBUG oslo_concurrency.processutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.600 186993 DEBUG nova.policy [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.611 186993 DEBUG oslo_concurrency.processutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.612 186993 DEBUG nova.virt.disk.api [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.612 186993 DEBUG oslo_concurrency.processutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.663 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.701 186993 DEBUG oslo_concurrency.processutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.702 186993 DEBUG nova.virt.disk.api [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.703 186993 DEBUG nova.objects.instance [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid e71a25fc-84a8-47f2-9cd4-00f608ed48a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.722 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.722 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Ensure instance console log exists: /var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.723 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.723 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:55 compute-0 nova_compute[186989]: 2025-12-10 10:26:55.724 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:56 compute-0 nova_compute[186989]: 2025-12-10 10:26:56.422 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:26:56 compute-0 nova_compute[186989]: 2025-12-10 10:26:56.734 186993 DEBUG nova.network.neutron [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Successfully updated port: 23c0526d-3a07-4045-b515-64f7987d3bec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:26:56 compute-0 nova_compute[186989]: 2025-12-10 10:26:56.751 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-e71a25fc-84a8-47f2-9cd4-00f608ed48a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:26:56 compute-0 nova_compute[186989]: 2025-12-10 10:26:56.752 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-e71a25fc-84a8-47f2-9cd4-00f608ed48a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:26:56 compute-0 nova_compute[186989]: 2025-12-10 10:26:56.753 186993 DEBUG nova.network.neutron [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:26:56 compute-0 nova_compute[186989]: 2025-12-10 10:26:56.841 186993 DEBUG nova.compute.manager [req-f97cdbfe-54f5-421a-84c6-15d724432629 req-c4596bfc-2be6-42f3-a96e-329538797dfd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Received event network-changed-23c0526d-3a07-4045-b515-64f7987d3bec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:56 compute-0 nova_compute[186989]: 2025-12-10 10:26:56.841 186993 DEBUG nova.compute.manager [req-f97cdbfe-54f5-421a-84c6-15d724432629 req-c4596bfc-2be6-42f3-a96e-329538797dfd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Refreshing instance network info cache due to event network-changed-23c0526d-3a07-4045-b515-64f7987d3bec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:26:56 compute-0 nova_compute[186989]: 2025-12-10 10:26:56.842 186993 DEBUG oslo_concurrency.lockutils [req-f97cdbfe-54f5-421a-84c6-15d724432629 req-c4596bfc-2be6-42f3-a96e-329538797dfd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-e71a25fc-84a8-47f2-9cd4-00f608ed48a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:26:56 compute-0 nova_compute[186989]: 2025-12-10 10:26:56.939 186993 DEBUG nova.network.neutron [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.668 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.787 186993 DEBUG nova.network.neutron [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Updating instance_info_cache with network_info: [{"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.806 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-e71a25fc-84a8-47f2-9cd4-00f608ed48a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.807 186993 DEBUG nova.compute.manager [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Instance network_info: |[{"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.807 186993 DEBUG oslo_concurrency.lockutils [req-f97cdbfe-54f5-421a-84c6-15d724432629 req-c4596bfc-2be6-42f3-a96e-329538797dfd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-e71a25fc-84a8-47f2-9cd4-00f608ed48a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.807 186993 DEBUG nova.network.neutron [req-f97cdbfe-54f5-421a-84c6-15d724432629 req-c4596bfc-2be6-42f3-a96e-329538797dfd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Refreshing network info cache for port 23c0526d-3a07-4045-b515-64f7987d3bec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.810 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Start _get_guest_xml network_info=[{"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.815 186993 WARNING nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.820 186993 DEBUG nova.virt.libvirt.host [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.820 186993 DEBUG nova.virt.libvirt.host [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.827 186993 DEBUG nova.virt.libvirt.host [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.828 186993 DEBUG nova.virt.libvirt.host [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.828 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.829 186993 DEBUG nova.virt.hardware [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.829 186993 DEBUG nova.virt.hardware [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.829 186993 DEBUG nova.virt.hardware [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.830 186993 DEBUG nova.virt.hardware [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.830 186993 DEBUG nova.virt.hardware [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.830 186993 DEBUG nova.virt.hardware [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.830 186993 DEBUG nova.virt.hardware [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.831 186993 DEBUG nova.virt.hardware [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.831 186993 DEBUG nova.virt.hardware [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.831 186993 DEBUG nova.virt.hardware [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.831 186993 DEBUG nova.virt.hardware [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.835 186993 DEBUG nova.virt.libvirt.vif [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:26:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-382428463',display_name='tempest-TestNetworkBasicOps-server-382428463',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-382428463',id=9,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKB66WzJZfDWRHyBR/68FAa3zGcU3UZ9Su8ltkpWAmQkgvyvTR/4yh2ddOEBkS2TJuHUUvJtqq5rmVaMCAB55HJvgsa2YberQZuH+i61UiYEuA3NcjJluQ5omV4DuIPsgQ==',key_name='tempest-TestNetworkBasicOps-359933792',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-lodmxwp9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:26:55Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=e71a25fc-84a8-47f2-9cd4-00f608ed48a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.835 186993 DEBUG nova.network.os_vif_util [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.836 186993 DEBUG nova.network.os_vif_util [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.837 186993 DEBUG nova.objects.instance [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid e71a25fc-84a8-47f2-9cd4-00f608ed48a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.852 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:26:57 compute-0 nova_compute[186989]:   <uuid>e71a25fc-84a8-47f2-9cd4-00f608ed48a5</uuid>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   <name>instance-00000009</name>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-382428463</nova:name>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:26:57</nova:creationTime>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:26:57 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:26:57 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:26:57 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:26:57 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:26:57 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:26:57 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:26:57 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:26:57 compute-0 nova_compute[186989]:         <nova:port uuid="23c0526d-3a07-4045-b515-64f7987d3bec">
Dec 10 10:26:57 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <system>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <entry name="serial">e71a25fc-84a8-47f2-9cd4-00f608ed48a5</entry>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <entry name="uuid">e71a25fc-84a8-47f2-9cd4-00f608ed48a5</entry>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     </system>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   <os>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   </os>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   <features>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   </features>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk.config"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:7b:60:01"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <target dev="tap23c0526d-3a"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/console.log" append="off"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <video>
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     </video>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:26:57 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:26:57 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:26:57 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:26:57 compute-0 nova_compute[186989]: </domain>
Dec 10 10:26:57 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.854 186993 DEBUG nova.compute.manager [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Preparing to wait for external event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.855 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.855 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.856 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.856 186993 DEBUG nova.virt.libvirt.vif [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:26:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-382428463',display_name='tempest-TestNetworkBasicOps-server-382428463',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-382428463',id=9,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKB66WzJZfDWRHyBR/68FAa3zGcU3UZ9Su8ltkpWAmQkgvyvTR/4yh2ddOEBkS2TJuHUUvJtqq5rmVaMCAB55HJvgsa2YberQZuH+i61UiYEuA3NcjJluQ5omV4DuIPsgQ==',key_name='tempest-TestNetworkBasicOps-359933792',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-lodmxwp9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:26:55Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=e71a25fc-84a8-47f2-9cd4-00f608ed48a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.857 186993 DEBUG nova.network.os_vif_util [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.858 186993 DEBUG nova.network.os_vif_util [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.858 186993 DEBUG os_vif [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.859 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.859 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.860 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.863 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.863 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23c0526d-3a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.864 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap23c0526d-3a, col_values=(('external_ids', {'iface-id': '23c0526d-3a07-4045-b515-64f7987d3bec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:60:01', 'vm-uuid': 'e71a25fc-84a8-47f2-9cd4-00f608ed48a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.866 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:57 compute-0 NetworkManager[55541]: <info>  [1765362417.8669] manager: (tap23c0526d-3a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.869 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.872 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.874 186993 INFO os_vif [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a')
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.929 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.930 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.930 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:7b:60:01, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:26:57 compute-0 nova_compute[186989]: 2025-12-10 10:26:57.931 186993 INFO nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Using config drive
Dec 10 10:26:58 compute-0 nova_compute[186989]: 2025-12-10 10:26:58.739 186993 INFO nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Creating config drive at /var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk.config
Dec 10 10:26:58 compute-0 nova_compute[186989]: 2025-12-10 10:26:58.745 186993 DEBUG oslo_concurrency.processutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqfajebq6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:26:58 compute-0 nova_compute[186989]: 2025-12-10 10:26:58.883 186993 DEBUG oslo_concurrency.processutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqfajebq6" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:26:58 compute-0 kernel: tap23c0526d-3a: entered promiscuous mode
Dec 10 10:26:58 compute-0 NetworkManager[55541]: <info>  [1765362418.9534] manager: (tap23c0526d-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Dec 10 10:26:58 compute-0 ovn_controller[95452]: 2025-12-10T10:26:58Z|00121|binding|INFO|Claiming lport 23c0526d-3a07-4045-b515-64f7987d3bec for this chassis.
Dec 10 10:26:58 compute-0 ovn_controller[95452]: 2025-12-10T10:26:58Z|00122|binding|INFO|23c0526d-3a07-4045-b515-64f7987d3bec: Claiming fa:16:3e:7b:60:01 10.100.0.13
Dec 10 10:26:58 compute-0 nova_compute[186989]: 2025-12-10 10:26:58.954 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:58 compute-0 ovn_controller[95452]: 2025-12-10T10:26:58Z|00123|binding|INFO|Setting lport 23c0526d-3a07-4045-b515-64f7987d3bec ovn-installed in OVS
Dec 10 10:26:58 compute-0 ovn_controller[95452]: 2025-12-10T10:26:58Z|00124|binding|INFO|Setting lport 23c0526d-3a07-4045-b515-64f7987d3bec up in Southbound
Dec 10 10:26:58 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:58.969 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:60:01 10.100.0.13'], port_security=['fa:16:3e:7b:60:01 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-653781637', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e71a25fc-84a8-47f2-9cd4-00f608ed48a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-653781637', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '7', 'neutron:security_group_ids': '796e6156-6d8e-4cf4-b04a-830fa4553503', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.245'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=302e70e8-d520-4dc6-aad3-73a28bc17154, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=23c0526d-3a07-4045-b515-64f7987d3bec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:26:58 compute-0 nova_compute[186989]: 2025-12-10 10:26:58.970 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:58 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:58.971 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 23c0526d-3a07-4045-b515-64f7987d3bec in datapath f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 bound to our chassis
Dec 10 10:26:58 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:58.972 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6
Dec 10 10:26:58 compute-0 nova_compute[186989]: 2025-12-10 10:26:58.976 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:58 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:58.984 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca8c49e-9878-47e9-8c81-1149b3d81f50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:58 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:58.986 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf6c2a93d-71 in ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 10 10:26:58 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:58.988 213247 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf6c2a93d-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 10 10:26:58 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:58.988 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[02130c95-4257-46ae-ae40-4caaca098b81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:58 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:58.989 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[d9672a49-7c6a-4cf5-9857-9cfa6c7b777a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.002 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e9ba1c-d1b9-4b01-b59b-d997decdc3ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 systemd-machined[153379]: New machine qemu-9-instance-00000009.
Dec 10 10:26:59 compute-0 systemd-udevd[217178]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.019 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[d6060213-0dd8-4e04-8548-59be82e64018]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 NetworkManager[55541]: <info>  [1765362419.0239] device (tap23c0526d-3a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:26:59 compute-0 NetworkManager[55541]: <info>  [1765362419.0255] device (tap23c0526d-3a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:26:59 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.053 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[81a10b76-d998-44c9-a6ee-34b57aa49129]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 NetworkManager[55541]: <info>  [1765362419.0608] manager: (tapf6c2a93d-70): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.060 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[4f95c12c-b6bb-47e8-9df3-37602ac0d678]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 systemd-udevd[217181]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.093 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[20ef4cbf-7a5c-43a0-97e5-7b23f4f0e17d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.097 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[3c428005-fc64-4f89-8fcc-b7105d5df0a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 NetworkManager[55541]: <info>  [1765362419.1198] device (tapf6c2a93d-70): carrier: link connected
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.124 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[c33f5364-16d9-47e9-ac1a-1d31070b5e6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.144 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ab8588-6240-444f-9fb3-6d1367b4679b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6c2a93d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:76:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 343433, 'reachable_time': 21295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217210, 'error': None, 'target': 'ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.164 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[a7de7a51-4917-48a7-bd45-5f6f60121b93]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:76e3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 343433, 'tstamp': 343433}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217211, 'error': None, 'target': 'ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.167 186993 DEBUG nova.compute.manager [req-af1d3a28-2a7e-4832-a369-1ebb7bef6805 req-c6374777-f349-4d56-8673-7166e9ef5dff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Received event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.168 186993 DEBUG oslo_concurrency.lockutils [req-af1d3a28-2a7e-4832-a369-1ebb7bef6805 req-c6374777-f349-4d56-8673-7166e9ef5dff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.168 186993 DEBUG oslo_concurrency.lockutils [req-af1d3a28-2a7e-4832-a369-1ebb7bef6805 req-c6374777-f349-4d56-8673-7166e9ef5dff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.168 186993 DEBUG oslo_concurrency.lockutils [req-af1d3a28-2a7e-4832-a369-1ebb7bef6805 req-c6374777-f349-4d56-8673-7166e9ef5dff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.169 186993 DEBUG nova.compute.manager [req-af1d3a28-2a7e-4832-a369-1ebb7bef6805 req-c6374777-f349-4d56-8673-7166e9ef5dff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Processing event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.182 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[314ec08a-a4e1-4148-af67-c0f7ed5e5a38]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6c2a93d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:76:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 343433, 'reachable_time': 21295, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217212, 'error': None, 'target': 'ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.218 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[596df2c0-685f-4e69-9f71-89c4d2d3ebd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.252 186993 DEBUG nova.network.neutron [req-f97cdbfe-54f5-421a-84c6-15d724432629 req-c4596bfc-2be6-42f3-a96e-329538797dfd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Updated VIF entry in instance network info cache for port 23c0526d-3a07-4045-b515-64f7987d3bec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.253 186993 DEBUG nova.network.neutron [req-f97cdbfe-54f5-421a-84c6-15d724432629 req-c4596bfc-2be6-42f3-a96e-329538797dfd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Updating instance_info_cache with network_info: [{"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.268 186993 DEBUG oslo_concurrency.lockutils [req-f97cdbfe-54f5-421a-84c6-15d724432629 req-c4596bfc-2be6-42f3-a96e-329538797dfd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-e71a25fc-84a8-47f2-9cd4-00f608ed48a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.287 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[071d77eb-ae0c-487c-827d-06d378920d04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.288 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6c2a93d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.289 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.289 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6c2a93d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.291 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:59 compute-0 NetworkManager[55541]: <info>  [1765362419.2924] manager: (tapf6c2a93d-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Dec 10 10:26:59 compute-0 kernel: tapf6c2a93d-70: entered promiscuous mode
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.295 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf6c2a93d-70, col_values=(('external_ids', {'iface-id': 'd517836d-4035-4dbb-836e-b4df0be637a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:26:59 compute-0 ovn_controller[95452]: 2025-12-10T10:26:59Z|00125|binding|INFO|Releasing lport d517836d-4035-4dbb-836e-b4df0be637a5 from this chassis (sb_readonly=0)
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.297 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.297 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.298 104302 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.299 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f9db66a3-912e-44ff-8b28-f29f30033e0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.300 104302 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: global
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     log         /dev/log local0 debug
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     log-tag     haproxy-metadata-proxy-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     user        root
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     group       root
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     maxconn     1024
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     pidfile     /var/lib/neutron/external/pids/f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6.pid.haproxy
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     daemon
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: defaults
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     log global
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     mode http
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     option httplog
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     option dontlognull
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     option http-server-close
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     option forwardfor
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     retries                 3
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     timeout http-request    30s
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     timeout connect         30s
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     timeout client          32s
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     timeout server          32s
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     timeout http-keep-alive 30s
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: listen listener
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     bind 169.254.169.254:80
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     server metadata /var/lib/neutron/metadata_proxy
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:     http-request add-header X-OVN-Network-ID f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 10 10:26:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:26:59.300 104302 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'env', 'PROCESS_TAG=haproxy-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.309 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:26:59 compute-0 podman[217244]: 2025-12-10 10:26:59.681264992 +0000 UTC m=+0.065298763 container create d2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Dec 10 10:26:59 compute-0 systemd[1]: Started libpod-conmon-d2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d.scope.
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.716 186993 DEBUG nova.compute.manager [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.718 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362419.715845, e71a25fc-84a8-47f2-9cd4-00f608ed48a5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.718 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] VM Started (Lifecycle Event)
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.721 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.726 186993 INFO nova.virt.libvirt.driver [-] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Instance spawned successfully.
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.727 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:26:59 compute-0 podman[217244]: 2025-12-10 10:26:59.649215247 +0000 UTC m=+0.033249018 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.746 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.752 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:26:59 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.755 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.756 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.757 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.757 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.757 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.758 186993 DEBUG nova.virt.libvirt.driver [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:26:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1770f43dc5b95deee4d6630e64a29757ba1f97370426058428155352802bdf0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:26:59 compute-0 podman[217244]: 2025-12-10 10:26:59.773061396 +0000 UTC m=+0.157095197 container init d2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Dec 10 10:26:59 compute-0 podman[217244]: 2025-12-10 10:26:59.77943508 +0000 UTC m=+0.163468851 container start d2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.785 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.785 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362419.7160776, e71a25fc-84a8-47f2-9cd4-00f608ed48a5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.786 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] VM Paused (Lifecycle Event)
Dec 10 10:26:59 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[217266]: [NOTICE]   (217270) : New worker (217272) forked
Dec 10 10:26:59 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[217266]: [NOTICE]   (217270) : Loading success.
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.818 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.824 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362419.7210522, e71a25fc-84a8-47f2-9cd4-00f608ed48a5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.825 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] VM Resumed (Lifecycle Event)
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.830 186993 INFO nova.compute.manager [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Took 4.49 seconds to spawn the instance on the hypervisor.
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.831 186993 DEBUG nova.compute.manager [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.842 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.846 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.869 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.887 186993 INFO nova.compute.manager [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Took 4.93 seconds to build instance.
Dec 10 10:26:59 compute-0 nova_compute[186989]: 2025-12-10 10:26:59.903 186993 DEBUG oslo_concurrency.lockutils [None req-f74249ec-6b32-45a6-8911-ad67b1b0540a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.618 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362405.616653, e101f1cf-8cb1-4383-b245-9f89d838a2e5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.619 186993 INFO nova.compute.manager [-] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] VM Stopped (Lifecycle Event)
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.640 186993 DEBUG nova.compute.manager [None req-c6e6ab7c-c1d5-4914-ad99-009bb27520e8 - - - - - -] [instance: e101f1cf-8cb1-4383-b245-9f89d838a2e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.797 186993 DEBUG oslo_concurrency.lockutils [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.799 186993 DEBUG oslo_concurrency.lockutils [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.799 186993 DEBUG oslo_concurrency.lockutils [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.800 186993 DEBUG oslo_concurrency.lockutils [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.800 186993 DEBUG oslo_concurrency.lockutils [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.802 186993 INFO nova.compute.manager [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Terminating instance
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.803 186993 DEBUG nova.compute.manager [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:27:00 compute-0 kernel: tap23c0526d-3a (unregistering): left promiscuous mode
Dec 10 10:27:00 compute-0 NetworkManager[55541]: <info>  [1765362420.8215] device (tap23c0526d-3a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:27:00 compute-0 ovn_controller[95452]: 2025-12-10T10:27:00Z|00126|binding|INFO|Releasing lport 23c0526d-3a07-4045-b515-64f7987d3bec from this chassis (sb_readonly=0)
Dec 10 10:27:00 compute-0 ovn_controller[95452]: 2025-12-10T10:27:00Z|00127|binding|INFO|Setting lport 23c0526d-3a07-4045-b515-64f7987d3bec down in Southbound
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.827 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:00 compute-0 ovn_controller[95452]: 2025-12-10T10:27:00Z|00128|binding|INFO|Removing iface tap23c0526d-3a ovn-installed in OVS
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.829 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:00 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:00.836 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:60:01 10.100.0.13'], port_security=['fa:16:3e:7b:60:01 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-653781637', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e71a25fc-84a8-47f2-9cd4-00f608ed48a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-653781637', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '9', 'neutron:security_group_ids': '796e6156-6d8e-4cf4-b04a-830fa4553503', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.245', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=302e70e8-d520-4dc6-aad3-73a28bc17154, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=23c0526d-3a07-4045-b515-64f7987d3bec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:27:00 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:00.838 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 23c0526d-3a07-4045-b515-64f7987d3bec in datapath f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 unbound from our chassis
Dec 10 10:27:00 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:00.839 104302 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 10 10:27:00 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:00.840 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[bc762c68-49be-4e7c-a0e6-cf29fd96fc39]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:00 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:00.841 104302 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 namespace which is not needed anymore
Dec 10 10:27:00 compute-0 nova_compute[186989]: 2025-12-10 10:27:00.845 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:00 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Dec 10 10:27:00 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 1.762s CPU time.
Dec 10 10:27:00 compute-0 systemd-machined[153379]: Machine qemu-9-instance-00000009 terminated.
Dec 10 10:27:00 compute-0 podman[217283]: 2025-12-10 10:27:00.932901059 +0000 UTC m=+0.070503995 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 10 10:27:00 compute-0 podman[217287]: 2025-12-10 10:27:00.939991442 +0000 UTC m=+0.079904151 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 10 10:27:00 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[217266]: [NOTICE]   (217270) : haproxy version is 2.8.14-c23fe91
Dec 10 10:27:00 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[217266]: [NOTICE]   (217270) : path to executable is /usr/sbin/haproxy
Dec 10 10:27:00 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[217266]: [WARNING]  (217270) : Exiting Master process...
Dec 10 10:27:00 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[217266]: [ALERT]    (217270) : Current worker (217272) exited with code 143 (Terminated)
Dec 10 10:27:00 compute-0 neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6[217266]: [WARNING]  (217270) : All workers exited. Exiting... (0)
Dec 10 10:27:00 compute-0 systemd[1]: libpod-d2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d.scope: Deactivated successfully.
Dec 10 10:27:00 compute-0 podman[217289]: 2025-12-10 10:27:00.991941779 +0000 UTC m=+0.126958065 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:27:00 compute-0 podman[217347]: 2025-12-10 10:27:00.997085299 +0000 UTC m=+0.068218502 container died d2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 10 10:27:01 compute-0 NetworkManager[55541]: <info>  [1765362421.0265] manager: (tap23c0526d-3a): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Dec 10 10:27:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d-userdata-shm.mount: Deactivated successfully.
Dec 10 10:27:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-b1770f43dc5b95deee4d6630e64a29757ba1f97370426058428155352802bdf0-merged.mount: Deactivated successfully.
Dec 10 10:27:01 compute-0 podman[217347]: 2025-12-10 10:27:01.039572459 +0000 UTC m=+0.110705662 container cleanup d2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:27:01 compute-0 systemd[1]: libpod-conmon-d2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d.scope: Deactivated successfully.
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.066 186993 INFO nova.virt.libvirt.driver [-] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Instance destroyed successfully.
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.066 186993 DEBUG nova.objects.instance [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid e71a25fc-84a8-47f2-9cd4-00f608ed48a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.081 186993 DEBUG nova.virt.libvirt.vif [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:26:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-382428463',display_name='tempest-TestNetworkBasicOps-server-382428463',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-382428463',id=9,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKB66WzJZfDWRHyBR/68FAa3zGcU3UZ9Su8ltkpWAmQkgvyvTR/4yh2ddOEBkS2TJuHUUvJtqq5rmVaMCAB55HJvgsa2YberQZuH+i61UiYEuA3NcjJluQ5omV4DuIPsgQ==',key_name='tempest-TestNetworkBasicOps-359933792',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:26:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-lodmxwp9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:26:59Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=e71a25fc-84a8-47f2-9cd4-00f608ed48a5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.081 186993 DEBUG nova.network.os_vif_util [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "23c0526d-3a07-4045-b515-64f7987d3bec", "address": "fa:16:3e:7b:60:01", "network": {"id": "f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6", "bridge": "br-int", "label": "tempest-network-smoke--2032641177", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23c0526d-3a", "ovs_interfaceid": "23c0526d-3a07-4045-b515-64f7987d3bec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.082 186993 DEBUG nova.network.os_vif_util [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.082 186993 DEBUG os_vif [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.084 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.084 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23c0526d-3a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.085 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.087 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.089 186993 INFO os_vif [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:60:01,bridge_name='br-int',has_traffic_filtering=True,id=23c0526d-3a07-4045-b515-64f7987d3bec,network=Network(f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap23c0526d-3a')
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.090 186993 INFO nova.virt.libvirt.driver [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Deleting instance files /var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5_del
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.090 186993 INFO nova.virt.libvirt.driver [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Deletion of /var/lib/nova/instances/e71a25fc-84a8-47f2-9cd4-00f608ed48a5_del complete
Dec 10 10:27:01 compute-0 podman[217408]: 2025-12-10 10:27:01.114912094 +0000 UTC m=+0.046814978 container remove d2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:27:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:01.124 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[296e87fb-5958-4aa9-bc1b-84c84edc63f1]: (4, ('Wed Dec 10 10:27:00 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 (d2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d)\nd2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d\nWed Dec 10 10:27:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 (d2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d)\nd2d05dceea97410665d04341d586a4c45c130b17295630d59e43ffa944778a6d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:01.125 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[9bd39503-47c8-4b0c-9c0f-94cc045486bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:01.126 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6c2a93d-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.128 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:01 compute-0 kernel: tapf6c2a93d-70: left promiscuous mode
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.138 186993 INFO nova.compute.manager [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Took 0.33 seconds to destroy the instance on the hypervisor.
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.139 186993 DEBUG oslo.service.loopingcall [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.139 186993 DEBUG nova.compute.manager [-] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.140 186993 DEBUG nova.network.neutron [-] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.143 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:01.143 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[28f81ca7-dc2b-4b0b-812e-a95027e1de67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:01.153 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[07a71a0f-0415-4dd9-ac87-4b7b34c7a59f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:01.155 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[091191a5-ac57-4f3e-9e5e-1e798b7ab7af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:01.172 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[010cc003-f6f8-4f04-92d8-31ea6d370f4d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 343426, 'reachable_time': 36333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217424, 'error': None, 'target': 'ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:01.175 104414 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f6c2a93d-7c6f-44cf-a6a8-2dc701b77fb6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 10 10:27:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:01.175 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[047b380a-5bf7-474a-8328-6a2f951c7058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:01 compute-0 systemd[1]: run-netns-ovnmeta\x2df6c2a93d\x2d7c6f\x2d44cf\x2da6a8\x2d2dc701b77fb6.mount: Deactivated successfully.
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.232 186993 DEBUG nova.compute.manager [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Received event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.232 186993 DEBUG oslo_concurrency.lockutils [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.232 186993 DEBUG oslo_concurrency.lockutils [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.233 186993 DEBUG oslo_concurrency.lockutils [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.233 186993 DEBUG nova.compute.manager [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] No waiting events found dispatching network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.233 186993 WARNING nova.compute.manager [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Received unexpected event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec for instance with vm_state active and task_state deleting.
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.233 186993 DEBUG nova.compute.manager [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Received event network-vif-unplugged-23c0526d-3a07-4045-b515-64f7987d3bec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.233 186993 DEBUG oslo_concurrency.lockutils [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.234 186993 DEBUG oslo_concurrency.lockutils [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.234 186993 DEBUG oslo_concurrency.lockutils [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.234 186993 DEBUG nova.compute.manager [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] No waiting events found dispatching network-vif-unplugged-23c0526d-3a07-4045-b515-64f7987d3bec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.234 186993 DEBUG nova.compute.manager [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Received event network-vif-unplugged-23c0526d-3a07-4045-b515-64f7987d3bec for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.235 186993 DEBUG nova.compute.manager [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Received event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.236 186993 DEBUG oslo_concurrency.lockutils [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.237 186993 DEBUG oslo_concurrency.lockutils [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.237 186993 DEBUG oslo_concurrency.lockutils [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.237 186993 DEBUG nova.compute.manager [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] No waiting events found dispatching network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.238 186993 WARNING nova.compute.manager [req-ba10216f-180f-4c0c-9ee4-fe036d1bb7b8 req-052638d5-3ef1-40ec-b8c4-a8ea84ff6bff 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Received unexpected event network-vif-plugged-23c0526d-3a07-4045-b515-64f7987d3bec for instance with vm_state active and task_state deleting.
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.975 186993 DEBUG nova.network.neutron [-] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:27:01 compute-0 nova_compute[186989]: 2025-12-10 10:27:01.992 186993 INFO nova.compute.manager [-] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Took 0.85 seconds to deallocate network for instance.
Dec 10 10:27:02 compute-0 nova_compute[186989]: 2025-12-10 10:27:02.038 186993 DEBUG oslo_concurrency.lockutils [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:02 compute-0 nova_compute[186989]: 2025-12-10 10:27:02.038 186993 DEBUG oslo_concurrency.lockutils [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:02 compute-0 nova_compute[186989]: 2025-12-10 10:27:02.091 186993 DEBUG nova.compute.provider_tree [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:27:02 compute-0 nova_compute[186989]: 2025-12-10 10:27:02.107 186993 DEBUG nova.scheduler.client.report [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:27:02 compute-0 nova_compute[186989]: 2025-12-10 10:27:02.135 186993 DEBUG oslo_concurrency.lockutils [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:02 compute-0 nova_compute[186989]: 2025-12-10 10:27:02.163 186993 INFO nova.scheduler.client.report [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance e71a25fc-84a8-47f2-9cd4-00f608ed48a5
Dec 10 10:27:02 compute-0 nova_compute[186989]: 2025-12-10 10:27:02.284 186993 DEBUG oslo_concurrency.lockutils [None req-e2673d1f-1c6d-4d48-bd7e-d47e27ee6c39 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "e71a25fc-84a8-47f2-9cd4-00f608ed48a5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.485s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:02 compute-0 nova_compute[186989]: 2025-12-10 10:27:02.671 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:06 compute-0 nova_compute[186989]: 2025-12-10 10:27:06.086 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:07 compute-0 nova_compute[186989]: 2025-12-10 10:27:07.674 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:10 compute-0 podman[217425]: 2025-12-10 10:27:10.023886515 +0000 UTC m=+0.064939143 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 10 10:27:10 compute-0 nova_compute[186989]: 2025-12-10 10:27:10.615 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:10 compute-0 nova_compute[186989]: 2025-12-10 10:27:10.697 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:11 compute-0 nova_compute[186989]: 2025-12-10 10:27:11.088 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:12 compute-0 podman[217447]: 2025-12-10 10:27:12.057792633 +0000 UTC m=+0.095611230 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 10 10:27:12 compute-0 nova_compute[186989]: 2025-12-10 10:27:12.676 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:16 compute-0 nova_compute[186989]: 2025-12-10 10:27:16.065 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362421.0644622, e71a25fc-84a8-47f2-9cd4-00f608ed48a5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:27:16 compute-0 nova_compute[186989]: 2025-12-10 10:27:16.066 186993 INFO nova.compute.manager [-] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] VM Stopped (Lifecycle Event)
Dec 10 10:27:16 compute-0 nova_compute[186989]: 2025-12-10 10:27:16.090 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:16 compute-0 nova_compute[186989]: 2025-12-10 10:27:16.189 186993 DEBUG nova.compute.manager [None req-b9e8fa78-d1c7-48e3-8131-f3ebb8a35f37 - - - - - -] [instance: e71a25fc-84a8-47f2-9cd4-00f608ed48a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:27:17 compute-0 nova_compute[186989]: 2025-12-10 10:27:17.715 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:21 compute-0 nova_compute[186989]: 2025-12-10 10:27:21.092 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:22 compute-0 nova_compute[186989]: 2025-12-10 10:27:22.717 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:24 compute-0 podman[217472]: 2025-12-10 10:27:24.012813038 +0000 UTC m=+0.059817863 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 10 10:27:26 compute-0 podman[217496]: 2025-12-10 10:27:26.031939474 +0000 UTC m=+0.080911600 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 10 10:27:26 compute-0 nova_compute[186989]: 2025-12-10 10:27:26.134 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:27 compute-0 nova_compute[186989]: 2025-12-10 10:27:27.719 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:29 compute-0 nova_compute[186989]: 2025-12-10 10:27:29.752 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "38561f38-3869-400d-9dc3-5f37104822d0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:29 compute-0 nova_compute[186989]: 2025-12-10 10:27:29.752 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:29 compute-0 nova_compute[186989]: 2025-12-10 10:27:29.769 186993 DEBUG nova.compute.manager [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:27:29 compute-0 nova_compute[186989]: 2025-12-10 10:27:29.845 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:29 compute-0 nova_compute[186989]: 2025-12-10 10:27:29.845 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:29 compute-0 nova_compute[186989]: 2025-12-10 10:27:29.854 186993 DEBUG nova.virt.hardware [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:27:29 compute-0 nova_compute[186989]: 2025-12-10 10:27:29.855 186993 INFO nova.compute.claims [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:27:29 compute-0 nova_compute[186989]: 2025-12-10 10:27:29.952 186993 DEBUG nova.compute.provider_tree [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:27:29 compute-0 nova_compute[186989]: 2025-12-10 10:27:29.966 186993 DEBUG nova.scheduler.client.report [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:27:29 compute-0 nova_compute[186989]: 2025-12-10 10:27:29.986 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:29 compute-0 nova_compute[186989]: 2025-12-10 10:27:29.987 186993 DEBUG nova.compute.manager [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.030 186993 DEBUG nova.compute.manager [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.030 186993 DEBUG nova.network.neutron [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.056 186993 INFO nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.082 186993 DEBUG nova.compute.manager [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.157 186993 DEBUG nova.compute.manager [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.159 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.159 186993 INFO nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Creating image(s)
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.160 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.161 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.162 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.179 186993 DEBUG oslo_concurrency.processutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.241 186993 DEBUG oslo_concurrency.processutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.242 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.243 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.254 186993 DEBUG oslo_concurrency.processutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.309 186993 DEBUG oslo_concurrency.processutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.311 186993 DEBUG oslo_concurrency.processutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:27:30 compute-0 nova_compute[186989]: 2025-12-10 10:27:30.499 186993 DEBUG nova.policy [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.080 186993 DEBUG nova.network.neutron [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Successfully created port: 5d0a0d4a-d7f5-46ea-b982-c33879f6e687 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.138 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.143 186993 DEBUG oslo_concurrency.processutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk 1073741824" returned: 0 in 0.833s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.144 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.145 186993 DEBUG oslo_concurrency.processutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.223 186993 DEBUG oslo_concurrency.processutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.224 186993 DEBUG nova.virt.disk.api [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.225 186993 DEBUG oslo_concurrency.processutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.303 186993 DEBUG oslo_concurrency.processutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.304 186993 DEBUG nova.virt.disk.api [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.305 186993 DEBUG nova.objects.instance [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid 38561f38-3869-400d-9dc3-5f37104822d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.317 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.318 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Ensure instance console log exists: /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.318 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.319 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:31 compute-0 nova_compute[186989]: 2025-12-10 10:27:31.319 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:31.470 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:31.470 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:31.471 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:32 compute-0 podman[217531]: 2025-12-10 10:27:32.014142278 +0000 UTC m=+0.056456111 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_id=multipathd, io.buildah.version=1.41.3)
Dec 10 10:27:32 compute-0 podman[217530]: 2025-12-10 10:27:32.014547639 +0000 UTC m=+0.059754461 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:27:32 compute-0 podman[217532]: 2025-12-10 10:27:32.048826774 +0000 UTC m=+0.085380800 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller)
Dec 10 10:27:32 compute-0 nova_compute[186989]: 2025-12-10 10:27:32.090 186993 DEBUG nova.network.neutron [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Successfully updated port: 5d0a0d4a-d7f5-46ea-b982-c33879f6e687 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:27:32 compute-0 nova_compute[186989]: 2025-12-10 10:27:32.108 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:27:32 compute-0 nova_compute[186989]: 2025-12-10 10:27:32.109 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:27:32 compute-0 nova_compute[186989]: 2025-12-10 10:27:32.109 186993 DEBUG nova.network.neutron [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:27:32 compute-0 nova_compute[186989]: 2025-12-10 10:27:32.175 186993 DEBUG nova.compute.manager [req-bf75a8c2-8929-40d7-b528-d35127fb2ae9 req-7f52dfdc-b297-4298-8444-c99ecf658dfc 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Received event network-changed-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:27:32 compute-0 nova_compute[186989]: 2025-12-10 10:27:32.176 186993 DEBUG nova.compute.manager [req-bf75a8c2-8929-40d7-b528-d35127fb2ae9 req-7f52dfdc-b297-4298-8444-c99ecf658dfc 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Refreshing instance network info cache due to event network-changed-5d0a0d4a-d7f5-46ea-b982-c33879f6e687. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:27:32 compute-0 nova_compute[186989]: 2025-12-10 10:27:32.176 186993 DEBUG oslo_concurrency.lockutils [req-bf75a8c2-8929-40d7-b528-d35127fb2ae9 req-7f52dfdc-b297-4298-8444-c99ecf658dfc 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:27:32 compute-0 nova_compute[186989]: 2025-12-10 10:27:32.466 186993 DEBUG nova.network.neutron [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:27:32 compute-0 nova_compute[186989]: 2025-12-10 10:27:32.722 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.018 186993 DEBUG nova.network.neutron [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Updating instance_info_cache with network_info: [{"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.050 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.051 186993 DEBUG nova.compute.manager [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Instance network_info: |[{"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.051 186993 DEBUG oslo_concurrency.lockutils [req-bf75a8c2-8929-40d7-b528-d35127fb2ae9 req-7f52dfdc-b297-4298-8444-c99ecf658dfc 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.051 186993 DEBUG nova.network.neutron [req-bf75a8c2-8929-40d7-b528-d35127fb2ae9 req-7f52dfdc-b297-4298-8444-c99ecf658dfc 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Refreshing network info cache for port 5d0a0d4a-d7f5-46ea-b982-c33879f6e687 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.054 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Start _get_guest_xml network_info=[{"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.058 186993 WARNING nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.066 186993 DEBUG nova.virt.libvirt.host [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.068 186993 DEBUG nova.virt.libvirt.host [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.072 186993 DEBUG nova.virt.libvirt.host [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.073 186993 DEBUG nova.virt.libvirt.host [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.074 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.075 186993 DEBUG nova.virt.hardware [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.075 186993 DEBUG nova.virt.hardware [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.075 186993 DEBUG nova.virt.hardware [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.076 186993 DEBUG nova.virt.hardware [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.076 186993 DEBUG nova.virt.hardware [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.076 186993 DEBUG nova.virt.hardware [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.076 186993 DEBUG nova.virt.hardware [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.076 186993 DEBUG nova.virt.hardware [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.077 186993 DEBUG nova.virt.hardware [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.077 186993 DEBUG nova.virt.hardware [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.077 186993 DEBUG nova.virt.hardware [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.081 186993 DEBUG nova.virt.libvirt.vif [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:27:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1170012183',display_name='tempest-TestNetworkBasicOps-server-1170012183',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1170012183',id=10,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBD/S3MIFhjDQcWP5bkFZil9XFrdHwleRoVAm3/szFvAPURRQt99tCposeAWvlwKUUKUdalnVmN/Owljn+HBXB8AlfxpoM9ULxL+k3ARcoiIo7vqROMnc9hVo0A5lNhSxA==',key_name='tempest-TestNetworkBasicOps-1362130893',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-1pbo4v0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:27:30Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=38561f38-3869-400d-9dc3-5f37104822d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.081 186993 DEBUG nova.network.os_vif_util [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.082 186993 DEBUG nova.network.os_vif_util [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:95:bd,bridge_name='br-int',has_traffic_filtering=True,id=5d0a0d4a-d7f5-46ea-b982-c33879f6e687,network=Network(41ae0339-dff6-4fe8-8447-0f930c6e18b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d0a0d4a-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.083 186993 DEBUG nova.objects.instance [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid 38561f38-3869-400d-9dc3-5f37104822d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.100 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:27:33 compute-0 nova_compute[186989]:   <uuid>38561f38-3869-400d-9dc3-5f37104822d0</uuid>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   <name>instance-0000000a</name>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-1170012183</nova:name>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:27:33</nova:creationTime>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:27:33 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:27:33 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:27:33 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:27:33 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:27:33 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:27:33 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:27:33 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:27:33 compute-0 nova_compute[186989]:         <nova:port uuid="5d0a0d4a-d7f5-46ea-b982-c33879f6e687">
Dec 10 10:27:33 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <system>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <entry name="serial">38561f38-3869-400d-9dc3-5f37104822d0</entry>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <entry name="uuid">38561f38-3869-400d-9dc3-5f37104822d0</entry>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     </system>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   <os>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   </os>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   <features>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   </features>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk.config"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:c3:95:bd"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <target dev="tap5d0a0d4a-d7"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/console.log" append="off"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <video>
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     </video>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:27:33 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:27:33 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:27:33 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:27:33 compute-0 nova_compute[186989]: </domain>
Dec 10 10:27:33 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.102 186993 DEBUG nova.compute.manager [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Preparing to wait for external event network-vif-plugged-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.102 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "38561f38-3869-400d-9dc3-5f37104822d0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.103 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.103 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.104 186993 DEBUG nova.virt.libvirt.vif [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:27:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1170012183',display_name='tempest-TestNetworkBasicOps-server-1170012183',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1170012183',id=10,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBD/S3MIFhjDQcWP5bkFZil9XFrdHwleRoVAm3/szFvAPURRQt99tCposeAWvlwKUUKUdalnVmN/Owljn+HBXB8AlfxpoM9ULxL+k3ARcoiIo7vqROMnc9hVo0A5lNhSxA==',key_name='tempest-TestNetworkBasicOps-1362130893',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-1pbo4v0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:27:30Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=38561f38-3869-400d-9dc3-5f37104822d0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.104 186993 DEBUG nova.network.os_vif_util [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.105 186993 DEBUG nova.network.os_vif_util [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:95:bd,bridge_name='br-int',has_traffic_filtering=True,id=5d0a0d4a-d7f5-46ea-b982-c33879f6e687,network=Network(41ae0339-dff6-4fe8-8447-0f930c6e18b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d0a0d4a-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.105 186993 DEBUG os_vif [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:95:bd,bridge_name='br-int',has_traffic_filtering=True,id=5d0a0d4a-d7f5-46ea-b982-c33879f6e687,network=Network(41ae0339-dff6-4fe8-8447-0f930c6e18b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d0a0d4a-d7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.107 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.108 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.108 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.112 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.113 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d0a0d4a-d7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.113 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5d0a0d4a-d7, col_values=(('external_ids', {'iface-id': '5d0a0d4a-d7f5-46ea-b982-c33879f6e687', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:95:bd', 'vm-uuid': '38561f38-3869-400d-9dc3-5f37104822d0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.115 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:33 compute-0 NetworkManager[55541]: <info>  [1765362453.1166] manager: (tap5d0a0d4a-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.118 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.122 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.123 186993 INFO os_vif [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:95:bd,bridge_name='br-int',has_traffic_filtering=True,id=5d0a0d4a-d7f5-46ea-b982-c33879f6e687,network=Network(41ae0339-dff6-4fe8-8447-0f930c6e18b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d0a0d4a-d7')
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.170 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.172 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.172 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:c3:95:bd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.173 186993 INFO nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Using config drive
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.690 186993 INFO nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Creating config drive at /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk.config
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.701 186993 DEBUG oslo_concurrency.processutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3n7xd64e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.840 186993 DEBUG oslo_concurrency.processutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3n7xd64e" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:27:33 compute-0 kernel: tap5d0a0d4a-d7: entered promiscuous mode
Dec 10 10:27:33 compute-0 ovn_controller[95452]: 2025-12-10T10:27:33Z|00129|binding|INFO|Claiming lport 5d0a0d4a-d7f5-46ea-b982-c33879f6e687 for this chassis.
Dec 10 10:27:33 compute-0 ovn_controller[95452]: 2025-12-10T10:27:33Z|00130|binding|INFO|5d0a0d4a-d7f5-46ea-b982-c33879f6e687: Claiming fa:16:3e:c3:95:bd 10.100.0.3
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.923 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:33 compute-0 NetworkManager[55541]: <info>  [1765362453.9255] manager: (tap5d0a0d4a-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Dec 10 10:27:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:33.938 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:95:bd 10.100.0.3'], port_security=['fa:16:3e:c3:95:bd 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '38561f38-3869-400d-9dc3-5f37104822d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41ae0339-dff6-4fe8-8447-0f930c6e18b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7f976492-1780-4558-bd8e-ef39147dbb4b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=702b933d-d9ea-4198-bea5-db427ce1063b, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=5d0a0d4a-d7f5-46ea-b982-c33879f6e687) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:27:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:33.940 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 5d0a0d4a-d7f5-46ea-b982-c33879f6e687 in datapath 41ae0339-dff6-4fe8-8447-0f930c6e18b6 bound to our chassis
Dec 10 10:27:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:33.942 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41ae0339-dff6-4fe8-8447-0f930c6e18b6
Dec 10 10:27:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:33.957 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[fc00934a-c783-4987-ba87-a359a000f9ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:33.958 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41ae0339-d1 in ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 10 10:27:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:33.960 213247 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41ae0339-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 10 10:27:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:33.961 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[180acf2a-60b9-41b5-903d-0975d67e23cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:33.962 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[bc77e6ec-800b-4d57-afd4-07cb0a656baf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.979 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:33 compute-0 systemd-machined[153379]: New machine qemu-10-instance-0000000a.
Dec 10 10:27:33 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:33.983 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[6b4055d2-c4c2-46fb-b0a5-34be6ebd2ceb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:33 compute-0 ovn_controller[95452]: 2025-12-10T10:27:33Z|00131|binding|INFO|Setting lport 5d0a0d4a-d7f5-46ea-b982-c33879f6e687 ovn-installed in OVS
Dec 10 10:27:33 compute-0 ovn_controller[95452]: 2025-12-10T10:27:33Z|00132|binding|INFO|Setting lport 5d0a0d4a-d7f5-46ea-b982-c33879f6e687 up in Southbound
Dec 10 10:27:33 compute-0 nova_compute[186989]: 2025-12-10 10:27:33.988 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:33 compute-0 systemd-udevd[217612]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:27:33 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Dec 10 10:27:34 compute-0 NetworkManager[55541]: <info>  [1765362454.0047] device (tap5d0a0d4a-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:27:34 compute-0 NetworkManager[55541]: <info>  [1765362454.0054] device (tap5d0a0d4a-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.005 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[cc9ceaf7-e417-40f7-b080-83b234b4bc8d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.039 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[1e0305fc-d025-4347-a5b2-cfdd257e7863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:34 compute-0 NetworkManager[55541]: <info>  [1765362454.0452] manager: (tap41ae0339-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.044 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[31d0de16-f30e-4704-b76f-f44903c65a01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.074 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[3367714e-1854-4ea8-8ded-c20b333e702b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.078 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[0c18ca8a-b15f-49fc-ac4f-6775e4d104b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:34 compute-0 NetworkManager[55541]: <info>  [1765362454.1010] device (tap41ae0339-d0): carrier: link connected
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.107 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[d8662156-330e-42e3-ae7a-6afa4df14b33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.124 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[6410fa28-6b9a-491a-b200-94b82c31d7e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41ae0339-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:67:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 346931, 'reachable_time': 41861, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217643, 'error': None, 'target': 'ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.138 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[fb61d37f-ddc2-4118-aad9-cd4ac2d6da69]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:67f8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 346931, 'tstamp': 346931}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217644, 'error': None, 'target': 'ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.152 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[c46fd2d0-553f-4b3b-8135-e915de1aa64d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41ae0339-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:67:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 346931, 'reachable_time': 41861, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217645, 'error': None, 'target': 'ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.184 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f08c6584-6518-4dbd-9cb8-04b8ca0efe34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.242 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[9b87ac6e-d523-4a3e-8cd2-47e741f97dd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.244 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41ae0339-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.244 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.245 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41ae0339-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.252 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:34 compute-0 kernel: tap41ae0339-d0: entered promiscuous mode
Dec 10 10:27:34 compute-0 NetworkManager[55541]: <info>  [1765362454.2529] manager: (tap41ae0339-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.254 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.255 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41ae0339-d0, col_values=(('external_ids', {'iface-id': '24a03d92-ddf6-4c58-b2de-4a88b1197dcf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.256 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:34 compute-0 ovn_controller[95452]: 2025-12-10T10:27:34Z|00133|binding|INFO|Releasing lport 24a03d92-ddf6-4c58-b2de-4a88b1197dcf from this chassis (sb_readonly=0)
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.269 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.270 104302 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41ae0339-dff6-4fe8-8447-0f930c6e18b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41ae0339-dff6-4fe8-8447-0f930c6e18b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.271 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[1624ed52-730a-49cc-9d41-aeb56fff2dbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.272 104302 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: global
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     log         /dev/log local0 debug
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     log-tag     haproxy-metadata-proxy-41ae0339-dff6-4fe8-8447-0f930c6e18b6
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     user        root
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     group       root
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     maxconn     1024
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     pidfile     /var/lib/neutron/external/pids/41ae0339-dff6-4fe8-8447-0f930c6e18b6.pid.haproxy
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     daemon
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: defaults
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     log global
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     mode http
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     option httplog
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     option dontlognull
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     option http-server-close
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     option forwardfor
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     retries                 3
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     timeout http-request    30s
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     timeout connect         30s
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     timeout client          32s
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     timeout server          32s
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     timeout http-keep-alive 30s
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: listen listener
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     bind 169.254.169.254:80
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     server metadata /var/lib/neutron/metadata_proxy
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:     http-request add-header X-OVN-Network-ID 41ae0339-dff6-4fe8-8447-0f930c6e18b6
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 10 10:27:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:27:34.273 104302 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6', 'env', 'PROCESS_TAG=haproxy-41ae0339-dff6-4fe8-8447-0f930c6e18b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41ae0339-dff6-4fe8-8447-0f930c6e18b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.436 186993 DEBUG nova.compute.manager [req-1c0936b3-9ad8-437b-8c0d-ff72fc41671f req-42061d39-c038-4fed-85e1-8a0c5f66598f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Received event network-vif-plugged-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.436 186993 DEBUG oslo_concurrency.lockutils [req-1c0936b3-9ad8-437b-8c0d-ff72fc41671f req-42061d39-c038-4fed-85e1-8a0c5f66598f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "38561f38-3869-400d-9dc3-5f37104822d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.437 186993 DEBUG oslo_concurrency.lockutils [req-1c0936b3-9ad8-437b-8c0d-ff72fc41671f req-42061d39-c038-4fed-85e1-8a0c5f66598f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.437 186993 DEBUG oslo_concurrency.lockutils [req-1c0936b3-9ad8-437b-8c0d-ff72fc41671f req-42061d39-c038-4fed-85e1-8a0c5f66598f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.438 186993 DEBUG nova.compute.manager [req-1c0936b3-9ad8-437b-8c0d-ff72fc41671f req-42061d39-c038-4fed-85e1-8a0c5f66598f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Processing event network-vif-plugged-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.571 186993 DEBUG nova.network.neutron [req-bf75a8c2-8929-40d7-b528-d35127fb2ae9 req-7f52dfdc-b297-4298-8444-c99ecf658dfc 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Updated VIF entry in instance network info cache for port 5d0a0d4a-d7f5-46ea-b982-c33879f6e687. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.572 186993 DEBUG nova.network.neutron [req-bf75a8c2-8929-40d7-b528-d35127fb2ae9 req-7f52dfdc-b297-4298-8444-c99ecf658dfc 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Updating instance_info_cache with network_info: [{"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.589 186993 DEBUG oslo_concurrency.lockutils [req-bf75a8c2-8929-40d7-b528-d35127fb2ae9 req-7f52dfdc-b297-4298-8444-c99ecf658dfc 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.666 186993 DEBUG nova.compute.manager [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.667 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362454.6654084, 38561f38-3869-400d-9dc3-5f37104822d0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.667 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] VM Started (Lifecycle Event)
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.671 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.675 186993 INFO nova.virt.libvirt.driver [-] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Instance spawned successfully.
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.676 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.690 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.696 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.700 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.701 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.701 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.702 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.702 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.703 186993 DEBUG nova.virt.libvirt.driver [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.712 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.713 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362454.6657617, 38561f38-3869-400d-9dc3-5f37104822d0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.713 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] VM Paused (Lifecycle Event)
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.734 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.739 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362454.671028, 38561f38-3869-400d-9dc3-5f37104822d0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.739 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] VM Resumed (Lifecycle Event)
Dec 10 10:27:34 compute-0 podman[217683]: 2025-12-10 10:27:34.649004702 +0000 UTC m=+0.027292275 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.757 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.763 186993 INFO nova.compute.manager [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Took 4.61 seconds to spawn the instance on the hypervisor.
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.763 186993 DEBUG nova.compute.manager [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.764 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.798 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.836 186993 INFO nova.compute.manager [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Took 5.03 seconds to build instance.
Dec 10 10:27:34 compute-0 nova_compute[186989]: 2025-12-10 10:27:34.852 186993 DEBUG oslo_concurrency.lockutils [None req-9ba04233-304a-4b0a-a0a8-58949413f22a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:35 compute-0 podman[217683]: 2025-12-10 10:27:35.357181202 +0000 UTC m=+0.735468765 container create f595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 10 10:27:35 compute-0 systemd[1]: Started libpod-conmon-f595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9.scope.
Dec 10 10:27:35 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:27:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ce3afe87d3acfcdde898aa6b35a8b6bd4f1ef9c5c392b95b64b5c6d5bca36d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:27:35 compute-0 podman[217683]: 2025-12-10 10:27:35.440548946 +0000 UTC m=+0.818836539 container init f595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 10 10:27:35 compute-0 podman[217683]: 2025-12-10 10:27:35.446683614 +0000 UTC m=+0.824971177 container start f595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:27:35 compute-0 neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6[217700]: [NOTICE]   (217704) : New worker (217706) forked
Dec 10 10:27:35 compute-0 neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6[217700]: [NOTICE]   (217704) : Loading success.
Dec 10 10:27:36 compute-0 ovn_controller[95452]: 2025-12-10T10:27:36Z|00134|binding|INFO|Releasing lport 24a03d92-ddf6-4c58-b2de-4a88b1197dcf from this chassis (sb_readonly=0)
Dec 10 10:27:36 compute-0 nova_compute[186989]: 2025-12-10 10:27:36.454 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:36 compute-0 NetworkManager[55541]: <info>  [1765362456.4552] manager: (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Dec 10 10:27:36 compute-0 NetworkManager[55541]: <info>  [1765362456.4559] manager: (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Dec 10 10:27:36 compute-0 ovn_controller[95452]: 2025-12-10T10:27:36Z|00135|binding|INFO|Releasing lport 24a03d92-ddf6-4c58-b2de-4a88b1197dcf from this chassis (sb_readonly=0)
Dec 10 10:27:36 compute-0 nova_compute[186989]: 2025-12-10 10:27:36.512 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:36 compute-0 nova_compute[186989]: 2025-12-10 10:27:36.521 186993 DEBUG nova.compute.manager [req-a591d27c-3eb7-4097-a20d-422985f5cd9b req-09e5a771-c1bc-4fe1-b7d5-7a1ba8a61a02 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Received event network-vif-plugged-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:27:36 compute-0 nova_compute[186989]: 2025-12-10 10:27:36.521 186993 DEBUG oslo_concurrency.lockutils [req-a591d27c-3eb7-4097-a20d-422985f5cd9b req-09e5a771-c1bc-4fe1-b7d5-7a1ba8a61a02 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "38561f38-3869-400d-9dc3-5f37104822d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:36 compute-0 nova_compute[186989]: 2025-12-10 10:27:36.522 186993 DEBUG oslo_concurrency.lockutils [req-a591d27c-3eb7-4097-a20d-422985f5cd9b req-09e5a771-c1bc-4fe1-b7d5-7a1ba8a61a02 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:36 compute-0 nova_compute[186989]: 2025-12-10 10:27:36.522 186993 DEBUG oslo_concurrency.lockutils [req-a591d27c-3eb7-4097-a20d-422985f5cd9b req-09e5a771-c1bc-4fe1-b7d5-7a1ba8a61a02 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:36 compute-0 nova_compute[186989]: 2025-12-10 10:27:36.522 186993 DEBUG nova.compute.manager [req-a591d27c-3eb7-4097-a20d-422985f5cd9b req-09e5a771-c1bc-4fe1-b7d5-7a1ba8a61a02 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] No waiting events found dispatching network-vif-plugged-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:27:36 compute-0 nova_compute[186989]: 2025-12-10 10:27:36.522 186993 WARNING nova.compute.manager [req-a591d27c-3eb7-4097-a20d-422985f5cd9b req-09e5a771-c1bc-4fe1-b7d5-7a1ba8a61a02 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Received unexpected event network-vif-plugged-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 for instance with vm_state active and task_state None.
Dec 10 10:27:36 compute-0 nova_compute[186989]: 2025-12-10 10:27:36.523 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:37 compute-0 nova_compute[186989]: 2025-12-10 10:27:37.779 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:38 compute-0 nova_compute[186989]: 2025-12-10 10:27:38.116 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:38 compute-0 nova_compute[186989]: 2025-12-10 10:27:38.613 186993 DEBUG nova.compute.manager [req-36aae7a9-7ec2-435c-a4f3-04d0f12999bc req-d334b8ac-0865-4716-bb24-9c5d0f2648bd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Received event network-changed-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:27:38 compute-0 nova_compute[186989]: 2025-12-10 10:27:38.614 186993 DEBUG nova.compute.manager [req-36aae7a9-7ec2-435c-a4f3-04d0f12999bc req-d334b8ac-0865-4716-bb24-9c5d0f2648bd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Refreshing instance network info cache due to event network-changed-5d0a0d4a-d7f5-46ea-b982-c33879f6e687. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:27:38 compute-0 nova_compute[186989]: 2025-12-10 10:27:38.615 186993 DEBUG oslo_concurrency.lockutils [req-36aae7a9-7ec2-435c-a4f3-04d0f12999bc req-d334b8ac-0865-4716-bb24-9c5d0f2648bd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:27:38 compute-0 nova_compute[186989]: 2025-12-10 10:27:38.615 186993 DEBUG oslo_concurrency.lockutils [req-36aae7a9-7ec2-435c-a4f3-04d0f12999bc req-d334b8ac-0865-4716-bb24-9c5d0f2648bd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:27:38 compute-0 nova_compute[186989]: 2025-12-10 10:27:38.615 186993 DEBUG nova.network.neutron [req-36aae7a9-7ec2-435c-a4f3-04d0f12999bc req-d334b8ac-0865-4716-bb24-9c5d0f2648bd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Refreshing network info cache for port 5d0a0d4a-d7f5-46ea-b982-c33879f6e687 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:27:41 compute-0 podman[217716]: 2025-12-10 10:27:41.02681372 +0000 UTC m=+0.066161676 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, architecture=x86_64, config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 10 10:27:41 compute-0 nova_compute[186989]: 2025-12-10 10:27:41.493 186993 DEBUG nova.network.neutron [req-36aae7a9-7ec2-435c-a4f3-04d0f12999bc req-d334b8ac-0865-4716-bb24-9c5d0f2648bd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Updated VIF entry in instance network info cache for port 5d0a0d4a-d7f5-46ea-b982-c33879f6e687. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:27:41 compute-0 nova_compute[186989]: 2025-12-10 10:27:41.493 186993 DEBUG nova.network.neutron [req-36aae7a9-7ec2-435c-a4f3-04d0f12999bc req-d334b8ac-0865-4716-bb24-9c5d0f2648bd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Updating instance_info_cache with network_info: [{"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:27:41 compute-0 nova_compute[186989]: 2025-12-10 10:27:41.803 186993 DEBUG oslo_concurrency.lockutils [req-36aae7a9-7ec2-435c-a4f3-04d0f12999bc req-d334b8ac-0865-4716-bb24-9c5d0f2648bd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:27:42 compute-0 nova_compute[186989]: 2025-12-10 10:27:42.782 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:43 compute-0 podman[217737]: 2025-12-10 10:27:43.018936079 +0000 UTC m=+0.063117483 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:27:43 compute-0 nova_compute[186989]: 2025-12-10 10:27:43.119 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:46 compute-0 ovn_controller[95452]: 2025-12-10T10:27:46Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c3:95:bd 10.100.0.3
Dec 10 10:27:46 compute-0 ovn_controller[95452]: 2025-12-10T10:27:46Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c3:95:bd 10.100.0.3
Dec 10 10:27:47 compute-0 nova_compute[186989]: 2025-12-10 10:27:47.825 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:47 compute-0 nova_compute[186989]: 2025-12-10 10:27:47.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:27:48 compute-0 nova_compute[186989]: 2025-12-10 10:27:48.121 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:48 compute-0 nova_compute[186989]: 2025-12-10 10:27:48.916 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:27:52 compute-0 nova_compute[186989]: 2025-12-10 10:27:52.828 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:52 compute-0 nova_compute[186989]: 2025-12-10 10:27:52.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:27:52 compute-0 nova_compute[186989]: 2025-12-10 10:27:52.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:27:52 compute-0 nova_compute[186989]: 2025-12-10 10:27:52.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:27:52 compute-0 nova_compute[186989]: 2025-12-10 10:27:52.923 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:27:53 compute-0 nova_compute[186989]: 2025-12-10 10:27:53.124 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:53 compute-0 nova_compute[186989]: 2025-12-10 10:27:53.277 186993 INFO nova.compute.manager [None req-14ad7acf-df2a-484b-b28b-0ddaa99995a6 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Get console output
Dec 10 10:27:53 compute-0 nova_compute[186989]: 2025-12-10 10:27:53.284 213152 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 10 10:27:53 compute-0 nova_compute[186989]: 2025-12-10 10:27:53.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:27:54 compute-0 ovn_controller[95452]: 2025-12-10T10:27:54Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c3:95:bd 10.100.0.3
Dec 10 10:27:54 compute-0 nova_compute[186989]: 2025-12-10 10:27:54.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:27:54 compute-0 nova_compute[186989]: 2025-12-10 10:27:54.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:27:54 compute-0 nova_compute[186989]: 2025-12-10 10:27:54.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:27:55 compute-0 podman[217769]: 2025-12-10 10:27:55.005370859 +0000 UTC m=+0.051904117 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:27:55 compute-0 nova_compute[186989]: 2025-12-10 10:27:55.485 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:27:55 compute-0 nova_compute[186989]: 2025-12-10 10:27:55.486 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquired lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:27:55 compute-0 nova_compute[186989]: 2025-12-10 10:27:55.486 186993 DEBUG nova.network.neutron [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 10 10:27:55 compute-0 nova_compute[186989]: 2025-12-10 10:27:55.487 186993 DEBUG nova.objects.instance [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 38561f38-3869-400d-9dc3-5f37104822d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:27:57 compute-0 podman[217793]: 2025-12-10 10:27:57.007612794 +0000 UTC m=+0.056100861 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:27:57 compute-0 nova_compute[186989]: 2025-12-10 10:27:57.511 186993 DEBUG nova.network.neutron [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Updating instance_info_cache with network_info: [{"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:27:57 compute-0 nova_compute[186989]: 2025-12-10 10:27:57.832 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:58 compute-0 nova_compute[186989]: 2025-12-10 10:27:58.164 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:27:58 compute-0 nova_compute[186989]: 2025-12-10 10:27:58.578 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Releasing lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:27:58 compute-0 nova_compute[186989]: 2025-12-10 10:27:58.579 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 10 10:27:58 compute-0 nova_compute[186989]: 2025-12-10 10:27:58.579 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:27:58 compute-0 nova_compute[186989]: 2025-12-10 10:27:58.579 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:27:58 compute-0 ovn_controller[95452]: 2025-12-10T10:27:58Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c3:95:bd 10.100.0.3
Dec 10 10:27:59 compute-0 nova_compute[186989]: 2025-12-10 10:27:59.147 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:27:59 compute-0 nova_compute[186989]: 2025-12-10 10:27:59.147 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:27:59 compute-0 nova_compute[186989]: 2025-12-10 10:27:59.148 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:27:59 compute-0 nova_compute[186989]: 2025-12-10 10:27:59.148 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:27:59 compute-0 nova_compute[186989]: 2025-12-10 10:27:59.940 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.009 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.010 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.073 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.275 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.277 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5592MB free_disk=73.30130386352539GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.278 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.278 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.376 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Instance 38561f38-3869-400d-9dc3-5f37104822d0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.376 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.377 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.421 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.439 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.465 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.465 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.751 186993 DEBUG nova.compute.manager [req-78e837c6-8c7c-45a3-9989-6e92f51b6d20 req-a7af9169-a1a9-46ad-8637-c04b1d9c7bdd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Received event network-changed-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.752 186993 DEBUG nova.compute.manager [req-78e837c6-8c7c-45a3-9989-6e92f51b6d20 req-a7af9169-a1a9-46ad-8637-c04b1d9c7bdd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Refreshing instance network info cache due to event network-changed-5d0a0d4a-d7f5-46ea-b982-c33879f6e687. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.752 186993 DEBUG oslo_concurrency.lockutils [req-78e837c6-8c7c-45a3-9989-6e92f51b6d20 req-a7af9169-a1a9-46ad-8637-c04b1d9c7bdd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.753 186993 DEBUG oslo_concurrency.lockutils [req-78e837c6-8c7c-45a3-9989-6e92f51b6d20 req-a7af9169-a1a9-46ad-8637-c04b1d9c7bdd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.753 186993 DEBUG nova.network.neutron [req-78e837c6-8c7c-45a3-9989-6e92f51b6d20 req-a7af9169-a1a9-46ad-8637-c04b1d9c7bdd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Refreshing network info cache for port 5d0a0d4a-d7f5-46ea-b982-c33879f6e687 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.856 186993 DEBUG oslo_concurrency.lockutils [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "38561f38-3869-400d-9dc3-5f37104822d0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.857 186993 DEBUG oslo_concurrency.lockutils [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.857 186993 DEBUG oslo_concurrency.lockutils [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "38561f38-3869-400d-9dc3-5f37104822d0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.857 186993 DEBUG oslo_concurrency.lockutils [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.857 186993 DEBUG oslo_concurrency.lockutils [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.859 186993 INFO nova.compute.manager [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Terminating instance
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.859 186993 DEBUG nova.compute.manager [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:28:00 compute-0 kernel: tap5d0a0d4a-d7 (unregistering): left promiscuous mode
Dec 10 10:28:00 compute-0 NetworkManager[55541]: <info>  [1765362480.8770] device (tap5d0a0d4a-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:28:00 compute-0 ovn_controller[95452]: 2025-12-10T10:28:00Z|00136|binding|INFO|Releasing lport 5d0a0d4a-d7f5-46ea-b982-c33879f6e687 from this chassis (sb_readonly=0)
Dec 10 10:28:00 compute-0 ovn_controller[95452]: 2025-12-10T10:28:00Z|00137|binding|INFO|Setting lport 5d0a0d4a-d7f5-46ea-b982-c33879f6e687 down in Southbound
Dec 10 10:28:00 compute-0 ovn_controller[95452]: 2025-12-10T10:28:00Z|00138|binding|INFO|Removing iface tap5d0a0d4a-d7 ovn-installed in OVS
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.885 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:00 compute-0 nova_compute[186989]: 2025-12-10 10:28:00.907 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:00 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:00.912 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:95:bd 10.100.0.3'], port_security=['fa:16:3e:c3:95:bd 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '38561f38-3869-400d-9dc3-5f37104822d0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41ae0339-dff6-4fe8-8447-0f930c6e18b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7f976492-1780-4558-bd8e-ef39147dbb4b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=702b933d-d9ea-4198-bea5-db427ce1063b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=5d0a0d4a-d7f5-46ea-b982-c33879f6e687) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:28:00 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:00.915 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 5d0a0d4a-d7f5-46ea-b982-c33879f6e687 in datapath 41ae0339-dff6-4fe8-8447-0f930c6e18b6 unbound from our chassis
Dec 10 10:28:00 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:00.916 104302 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41ae0339-dff6-4fe8-8447-0f930c6e18b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 10 10:28:00 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:00.917 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[edc9c88d-33aa-4a23-8665-608ad18b7dc5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:00 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:00.918 104302 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6 namespace which is not needed anymore
Dec 10 10:28:00 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Dec 10 10:28:00 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 14.172s CPU time.
Dec 10 10:28:00 compute-0 systemd-machined[153379]: Machine qemu-10-instance-0000000a terminated.
Dec 10 10:28:01 compute-0 neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6[217700]: [NOTICE]   (217704) : haproxy version is 2.8.14-c23fe91
Dec 10 10:28:01 compute-0 neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6[217700]: [NOTICE]   (217704) : path to executable is /usr/sbin/haproxy
Dec 10 10:28:01 compute-0 neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6[217700]: [WARNING]  (217704) : Exiting Master process...
Dec 10 10:28:01 compute-0 neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6[217700]: [ALERT]    (217704) : Current worker (217706) exited with code 143 (Terminated)
Dec 10 10:28:01 compute-0 neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6[217700]: [WARNING]  (217704) : All workers exited. Exiting... (0)
Dec 10 10:28:01 compute-0 systemd[1]: libpod-f595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9.scope: Deactivated successfully.
Dec 10 10:28:01 compute-0 podman[217844]: 2025-12-10 10:28:01.127216903 +0000 UTC m=+0.090755816 container died f595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.129 186993 INFO nova.virt.libvirt.driver [-] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Instance destroyed successfully.
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.131 186993 DEBUG nova.objects.instance [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid 38561f38-3869-400d-9dc3-5f37104822d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:28:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9-userdata-shm.mount: Deactivated successfully.
Dec 10 10:28:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-6ce3afe87d3acfcdde898aa6b35a8b6bd4f1ef9c5c392b95b64b5c6d5bca36d2-merged.mount: Deactivated successfully.
Dec 10 10:28:01 compute-0 podman[217844]: 2025-12-10 10:28:01.185847841 +0000 UTC m=+0.149386734 container cleanup f595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.200 186993 DEBUG nova.virt.libvirt.vif [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:27:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1170012183',display_name='tempest-TestNetworkBasicOps-server-1170012183',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1170012183',id=10,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBD/S3MIFhjDQcWP5bkFZil9XFrdHwleRoVAm3/szFvAPURRQt99tCposeAWvlwKUUKUdalnVmN/Owljn+HBXB8AlfxpoM9ULxL+k3ARcoiIo7vqROMnc9hVo0A5lNhSxA==',key_name='tempest-TestNetworkBasicOps-1362130893',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:27:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-1pbo4v0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:27:34Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=38561f38-3869-400d-9dc3-5f37104822d0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.201 186993 DEBUG nova.network.os_vif_util [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.202 186993 DEBUG nova.network.os_vif_util [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:95:bd,bridge_name='br-int',has_traffic_filtering=True,id=5d0a0d4a-d7f5-46ea-b982-c33879f6e687,network=Network(41ae0339-dff6-4fe8-8447-0f930c6e18b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d0a0d4a-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.202 186993 DEBUG os_vif [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:95:bd,bridge_name='br-int',has_traffic_filtering=True,id=5d0a0d4a-d7f5-46ea-b982-c33879f6e687,network=Network(41ae0339-dff6-4fe8-8447-0f930c6e18b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d0a0d4a-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.205 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.205 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d0a0d4a-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.207 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.208 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:01 compute-0 systemd[1]: libpod-conmon-f595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9.scope: Deactivated successfully.
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.211 186993 INFO os_vif [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:95:bd,bridge_name='br-int',has_traffic_filtering=True,id=5d0a0d4a-d7f5-46ea-b982-c33879f6e687,network=Network(41ae0339-dff6-4fe8-8447-0f930c6e18b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5d0a0d4a-d7')
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.212 186993 INFO nova.virt.libvirt.driver [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Deleting instance files /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0_del
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.213 186993 INFO nova.virt.libvirt.driver [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Deletion of /var/lib/nova/instances/38561f38-3869-400d-9dc3-5f37104822d0_del complete
Dec 10 10:28:01 compute-0 podman[217889]: 2025-12-10 10:28:01.30500887 +0000 UTC m=+0.094854869 container remove f595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:28:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:01.311 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[af7a8814-9f1e-4360-8af7-2664f9e6784c]: (4, ('Wed Dec 10 10:28:01 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6 (f595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9)\nf595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9\nWed Dec 10 10:28:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6 (f595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9)\nf595b6fc26c8a462a526485f358fc06d57c0655a714aee24d1be29e8a7e51cf9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:01.313 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[1c34a947-21a8-4ed4-84a0-e7fe95eca976]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:01.314 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41ae0339-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.316 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:01 compute-0 kernel: tap41ae0339-d0: left promiscuous mode
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.329 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:01.331 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[58eac669-dc3c-4f23-bb22-899f667a41aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:01.343 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[516f2d8d-bc5a-4b8c-92c2-6b99a13ac617]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:01.345 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[cf70853a-a1fe-4d65-aa84-ae64ad2abd93]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:01.362 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[a3283a9a-7e6a-4993-8a24-0dfbe02a9c6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 346925, 'reachable_time': 44508, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217904, 'error': None, 'target': 'ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:01.365 104414 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41ae0339-dff6-4fe8-8447-0f930c6e18b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 10 10:28:01 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:01.366 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7f1fda-c01e-474f-9056-3d6f98362956]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:01 compute-0 systemd[1]: run-netns-ovnmeta\x2d41ae0339\x2ddff6\x2d4fe8\x2d8447\x2d0f930c6e18b6.mount: Deactivated successfully.
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.426 186993 INFO nova.compute.manager [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Took 0.57 seconds to destroy the instance on the hypervisor.
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.428 186993 DEBUG oslo.service.loopingcall [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.428 186993 DEBUG nova.compute.manager [-] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.429 186993 DEBUG nova.network.neutron [-] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:28:01 compute-0 nova_compute[186989]: 2025-12-10 10:28:01.460 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:28:02 compute-0 nova_compute[186989]: 2025-12-10 10:28:02.832 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:02 compute-0 nova_compute[186989]: 2025-12-10 10:28:02.853 186993 DEBUG nova.compute.manager [req-56c38583-7527-4f5b-98ff-fe20ea9c122a req-86117567-b9f0-4e2b-9dac-1c7a85af59cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Received event network-vif-unplugged-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:28:02 compute-0 nova_compute[186989]: 2025-12-10 10:28:02.854 186993 DEBUG oslo_concurrency.lockutils [req-56c38583-7527-4f5b-98ff-fe20ea9c122a req-86117567-b9f0-4e2b-9dac-1c7a85af59cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "38561f38-3869-400d-9dc3-5f37104822d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:02 compute-0 nova_compute[186989]: 2025-12-10 10:28:02.855 186993 DEBUG oslo_concurrency.lockutils [req-56c38583-7527-4f5b-98ff-fe20ea9c122a req-86117567-b9f0-4e2b-9dac-1c7a85af59cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:02 compute-0 nova_compute[186989]: 2025-12-10 10:28:02.855 186993 DEBUG oslo_concurrency.lockutils [req-56c38583-7527-4f5b-98ff-fe20ea9c122a req-86117567-b9f0-4e2b-9dac-1c7a85af59cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:02 compute-0 nova_compute[186989]: 2025-12-10 10:28:02.856 186993 DEBUG nova.compute.manager [req-56c38583-7527-4f5b-98ff-fe20ea9c122a req-86117567-b9f0-4e2b-9dac-1c7a85af59cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] No waiting events found dispatching network-vif-unplugged-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:28:02 compute-0 nova_compute[186989]: 2025-12-10 10:28:02.857 186993 DEBUG nova.compute.manager [req-56c38583-7527-4f5b-98ff-fe20ea9c122a req-86117567-b9f0-4e2b-9dac-1c7a85af59cd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Received event network-vif-unplugged-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 10 10:28:03 compute-0 podman[217906]: 2025-12-10 10:28:03.026844849 +0000 UTC m=+0.060399218 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 10 10:28:03 compute-0 podman[217905]: 2025-12-10 10:28:03.032625278 +0000 UTC m=+0.071249457 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 10 10:28:03 compute-0 podman[217907]: 2025-12-10 10:28:03.049452232 +0000 UTC m=+0.083607728 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.117 186993 DEBUG nova.compute.manager [req-054e8e14-42e9-437f-a418-624bbb9968a1 req-d31ec0b4-034e-4998-b13f-9ffc98951082 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Received event network-vif-plugged-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.118 186993 DEBUG oslo_concurrency.lockutils [req-054e8e14-42e9-437f-a418-624bbb9968a1 req-d31ec0b4-034e-4998-b13f-9ffc98951082 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "38561f38-3869-400d-9dc3-5f37104822d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.119 186993 DEBUG oslo_concurrency.lockutils [req-054e8e14-42e9-437f-a418-624bbb9968a1 req-d31ec0b4-034e-4998-b13f-9ffc98951082 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.119 186993 DEBUG oslo_concurrency.lockutils [req-054e8e14-42e9-437f-a418-624bbb9968a1 req-d31ec0b4-034e-4998-b13f-9ffc98951082 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.120 186993 DEBUG nova.compute.manager [req-054e8e14-42e9-437f-a418-624bbb9968a1 req-d31ec0b4-034e-4998-b13f-9ffc98951082 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] No waiting events found dispatching network-vif-plugged-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.120 186993 WARNING nova.compute.manager [req-054e8e14-42e9-437f-a418-624bbb9968a1 req-d31ec0b4-034e-4998-b13f-9ffc98951082 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Received unexpected event network-vif-plugged-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 for instance with vm_state active and task_state deleting.
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.158 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:05 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:05.158 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:d5:69', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:b1:dd:ed:fa:0b'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:28:05 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:05.160 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.503 186993 DEBUG nova.network.neutron [-] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.507 186993 DEBUG nova.network.neutron [req-78e837c6-8c7c-45a3-9989-6e92f51b6d20 req-a7af9169-a1a9-46ad-8637-c04b1d9c7bdd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Updated VIF entry in instance network info cache for port 5d0a0d4a-d7f5-46ea-b982-c33879f6e687. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.507 186993 DEBUG nova.network.neutron [req-78e837c6-8c7c-45a3-9989-6e92f51b6d20 req-a7af9169-a1a9-46ad-8637-c04b1d9c7bdd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Updating instance_info_cache with network_info: [{"id": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "address": "fa:16:3e:c3:95:bd", "network": {"id": "41ae0339-dff6-4fe8-8447-0f930c6e18b6", "bridge": "br-int", "label": "tempest-network-smoke--1145886993", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5d0a0d4a-d7", "ovs_interfaceid": "5d0a0d4a-d7f5-46ea-b982-c33879f6e687", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.523 186993 INFO nova.compute.manager [-] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Took 4.09 seconds to deallocate network for instance.
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.530 186993 DEBUG oslo_concurrency.lockutils [req-78e837c6-8c7c-45a3-9989-6e92f51b6d20 req-a7af9169-a1a9-46ad-8637-c04b1d9c7bdd 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-38561f38-3869-400d-9dc3-5f37104822d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.562 186993 DEBUG oslo_concurrency.lockutils [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.563 186993 DEBUG oslo_concurrency.lockutils [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.636 186993 DEBUG nova.compute.provider_tree [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.653 186993 DEBUG nova.scheduler.client.report [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.680 186993 DEBUG oslo_concurrency.lockutils [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.726 186993 INFO nova.scheduler.client.report [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance 38561f38-3869-400d-9dc3-5f37104822d0
Dec 10 10:28:05 compute-0 nova_compute[186989]: 2025-12-10 10:28:05.784 186993 DEBUG oslo_concurrency.lockutils [None req-27302be8-57de-4f2c-9df4-8c993ced677b 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "38561f38-3869-400d-9dc3-5f37104822d0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:06 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:06.162 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:06 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 10 10:28:06 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 10 10:28:06 compute-0 nova_compute[186989]: 2025-12-10 10:28:06.209 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:07 compute-0 nova_compute[186989]: 2025-12-10 10:28:07.241 186993 DEBUG nova.compute.manager [req-5a8901f5-5033-403c-b8ce-bb9173e207ea req-95e60729-c1b6-4c8f-bd30-77a9c5c09ecf 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Received event network-vif-deleted-5d0a0d4a-d7f5-46ea-b982-c33879f6e687 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:28:07 compute-0 nova_compute[186989]: 2025-12-10 10:28:07.241 186993 INFO nova.compute.manager [req-5a8901f5-5033-403c-b8ce-bb9173e207ea req-95e60729-c1b6-4c8f-bd30-77a9c5c09ecf 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Neutron deleted interface 5d0a0d4a-d7f5-46ea-b982-c33879f6e687; detaching it from the instance and deleting it from the info cache
Dec 10 10:28:07 compute-0 nova_compute[186989]: 2025-12-10 10:28:07.242 186993 DEBUG nova.network.neutron [req-5a8901f5-5033-403c-b8ce-bb9173e207ea req-95e60729-c1b6-4c8f-bd30-77a9c5c09ecf 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Dec 10 10:28:07 compute-0 nova_compute[186989]: 2025-12-10 10:28:07.245 186993 DEBUG nova.compute.manager [req-5a8901f5-5033-403c-b8ce-bb9173e207ea req-95e60729-c1b6-4c8f-bd30-77a9c5c09ecf 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Detach interface failed, port_id=5d0a0d4a-d7f5-46ea-b982-c33879f6e687, reason: Instance 38561f38-3869-400d-9dc3-5f37104822d0 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Dec 10 10:28:07 compute-0 nova_compute[186989]: 2025-12-10 10:28:07.836 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:11 compute-0 nova_compute[186989]: 2025-12-10 10:28:11.213 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:11 compute-0 nova_compute[186989]: 2025-12-10 10:28:11.655 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:11 compute-0 nova_compute[186989]: 2025-12-10 10:28:11.746 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:12 compute-0 podman[217972]: 2025-12-10 10:28:12.025915431 +0000 UTC m=+0.065922080 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=)
Dec 10 10:28:12 compute-0 nova_compute[186989]: 2025-12-10 10:28:12.838 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:14 compute-0 podman[217993]: 2025-12-10 10:28:14.009200085 +0000 UTC m=+0.057197950 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:28:16 compute-0 nova_compute[186989]: 2025-12-10 10:28:16.127 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362481.1258075, 38561f38-3869-400d-9dc3-5f37104822d0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:28:16 compute-0 nova_compute[186989]: 2025-12-10 10:28:16.128 186993 INFO nova.compute.manager [-] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] VM Stopped (Lifecycle Event)
Dec 10 10:28:16 compute-0 nova_compute[186989]: 2025-12-10 10:28:16.151 186993 DEBUG nova.compute.manager [None req-caf6fc3f-ebcf-44ba-81b0-b307baab1645 - - - - - -] [instance: 38561f38-3869-400d-9dc3-5f37104822d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:28:16 compute-0 nova_compute[186989]: 2025-12-10 10:28:16.216 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:17 compute-0 nova_compute[186989]: 2025-12-10 10:28:17.841 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:21 compute-0 nova_compute[186989]: 2025-12-10 10:28:21.219 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:22 compute-0 nova_compute[186989]: 2025-12-10 10:28:22.843 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:26 compute-0 podman[218017]: 2025-12-10 10:28:26.024032863 +0000 UTC m=+0.069498739 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 10 10:28:26 compute-0 nova_compute[186989]: 2025-12-10 10:28:26.222 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.161 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.162 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.260 186993 DEBUG nova.compute.manager [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.354 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.355 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.367 186993 DEBUG nova.virt.hardware [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.367 186993 INFO nova.compute.claims [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.478 186993 DEBUG nova.compute.provider_tree [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.491 186993 DEBUG nova.scheduler.client.report [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.516 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.517 186993 DEBUG nova.compute.manager [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.561 186993 DEBUG nova.compute.manager [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.562 186993 DEBUG nova.network.neutron [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.578 186993 INFO nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.595 186993 DEBUG nova.compute.manager [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.702 186993 DEBUG nova.compute.manager [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.707 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.708 186993 INFO nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Creating image(s)
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.709 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.710 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.712 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.740 186993 DEBUG oslo_concurrency.processutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.809 186993 DEBUG oslo_concurrency.processutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.810 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.811 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.826 186993 DEBUG oslo_concurrency.processutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.845 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.883 186993 DEBUG oslo_concurrency.processutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.884 186993 DEBUG oslo_concurrency.processutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.924 186993 DEBUG oslo_concurrency.processutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.925 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.926 186993 DEBUG oslo_concurrency.processutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.990 186993 DEBUG oslo_concurrency.processutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.992 186993 DEBUG nova.virt.disk.api [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:28:27 compute-0 nova_compute[186989]: 2025-12-10 10:28:27.993 186993 DEBUG oslo_concurrency.processutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:28 compute-0 podman[218051]: 2025-12-10 10:28:28.020937152 +0000 UTC m=+0.063872194 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:28:28 compute-0 nova_compute[186989]: 2025-12-10 10:28:28.081 186993 DEBUG oslo_concurrency.processutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:28 compute-0 nova_compute[186989]: 2025-12-10 10:28:28.083 186993 DEBUG nova.virt.disk.api [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:28:28 compute-0 nova_compute[186989]: 2025-12-10 10:28:28.083 186993 DEBUG nova.objects.instance [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:28:28 compute-0 nova_compute[186989]: 2025-12-10 10:28:28.108 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:28:28 compute-0 nova_compute[186989]: 2025-12-10 10:28:28.109 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Ensure instance console log exists: /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:28:28 compute-0 nova_compute[186989]: 2025-12-10 10:28:28.109 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:28 compute-0 nova_compute[186989]: 2025-12-10 10:28:28.110 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:28 compute-0 nova_compute[186989]: 2025-12-10 10:28:28.110 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:28 compute-0 nova_compute[186989]: 2025-12-10 10:28:28.545 186993 DEBUG nova.policy [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:28:30 compute-0 nova_compute[186989]: 2025-12-10 10:28:30.739 186993 DEBUG nova.network.neutron [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Successfully created port: 4a577958-715d-4941-ad19-f7f8b1cf8586 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:28:31 compute-0 nova_compute[186989]: 2025-12-10 10:28:31.225 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:31.471 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:31.472 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:31.472 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:32 compute-0 nova_compute[186989]: 2025-12-10 10:28:32.735 186993 DEBUG nova.network.neutron [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Successfully updated port: 4a577958-715d-4941-ad19-f7f8b1cf8586 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:28:32 compute-0 nova_compute[186989]: 2025-12-10 10:28:32.798 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:28:32 compute-0 nova_compute[186989]: 2025-12-10 10:28:32.799 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:28:32 compute-0 nova_compute[186989]: 2025-12-10 10:28:32.799 186993 DEBUG nova.network.neutron [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:28:32 compute-0 nova_compute[186989]: 2025-12-10 10:28:32.848 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:32 compute-0 nova_compute[186989]: 2025-12-10 10:28:32.927 186993 DEBUG nova.compute.manager [req-ad4371e1-730c-468e-b6c0-b2063b50c476 req-8af1d165-1169-4533-ac1a-0641fce14567 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-changed-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:28:32 compute-0 nova_compute[186989]: 2025-12-10 10:28:32.928 186993 DEBUG nova.compute.manager [req-ad4371e1-730c-468e-b6c0-b2063b50c476 req-8af1d165-1169-4533-ac1a-0641fce14567 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Refreshing instance network info cache due to event network-changed-4a577958-715d-4941-ad19-f7f8b1cf8586. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:28:32 compute-0 nova_compute[186989]: 2025-12-10 10:28:32.929 186993 DEBUG oslo_concurrency.lockutils [req-ad4371e1-730c-468e-b6c0-b2063b50c476 req-8af1d165-1169-4533-ac1a-0641fce14567 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:28:33 compute-0 nova_compute[186989]: 2025-12-10 10:28:33.509 186993 DEBUG nova.network.neutron [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:28:34 compute-0 podman[218075]: 2025-12-10 10:28:34.071023958 +0000 UTC m=+0.113324739 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:28:34 compute-0 podman[218076]: 2025-12-10 10:28:34.089166198 +0000 UTC m=+0.127706565 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 10 10:28:34 compute-0 podman[218077]: 2025-12-10 10:28:34.095589085 +0000 UTC m=+0.125428682 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.820 186993 DEBUG nova.network.neutron [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updating instance_info_cache with network_info: [{"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.840 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.841 186993 DEBUG nova.compute.manager [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Instance network_info: |[{"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.841 186993 DEBUG oslo_concurrency.lockutils [req-ad4371e1-730c-468e-b6c0-b2063b50c476 req-8af1d165-1169-4533-ac1a-0641fce14567 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.841 186993 DEBUG nova.network.neutron [req-ad4371e1-730c-468e-b6c0-b2063b50c476 req-8af1d165-1169-4533-ac1a-0641fce14567 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Refreshing network info cache for port 4a577958-715d-4941-ad19-f7f8b1cf8586 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.844 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Start _get_guest_xml network_info=[{"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.850 186993 WARNING nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.858 186993 DEBUG nova.virt.libvirt.host [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.859 186993 DEBUG nova.virt.libvirt.host [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.864 186993 DEBUG nova.virt.libvirt.host [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.865 186993 DEBUG nova.virt.libvirt.host [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.865 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.865 186993 DEBUG nova.virt.hardware [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.866 186993 DEBUG nova.virt.hardware [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.866 186993 DEBUG nova.virt.hardware [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.866 186993 DEBUG nova.virt.hardware [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.867 186993 DEBUG nova.virt.hardware [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.867 186993 DEBUG nova.virt.hardware [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.867 186993 DEBUG nova.virt.hardware [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.867 186993 DEBUG nova.virt.hardware [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.868 186993 DEBUG nova.virt.hardware [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.868 186993 DEBUG nova.virt.hardware [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.868 186993 DEBUG nova.virt.hardware [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.872 186993 DEBUG nova.virt.libvirt.vif [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:28:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-73136063',display_name='tempest-TestNetworkBasicOps-server-73136063',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-73136063',id=11,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDBmKvYbT6bj4t57mDlrUSDmSjI3rzy8h8vceGx1Im7+MUXO41DcyBhm3Ct6eex1O+G5RLMqXn5sL8le7JyNPS5pNGQDvSbn/Ev+RDYQsNA6pKTEQis7SzRTupmQ5oaPFA==',key_name='tempest-TestNetworkBasicOps-1390815627',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-rzca3oo0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:28:27Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=9d70dd0a-d1e7-4821-a30b-0f11f1440ae5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.872 186993 DEBUG nova.network.os_vif_util [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.873 186993 DEBUG nova.network.os_vif_util [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:ca:06,bridge_name='br-int',has_traffic_filtering=True,id=4a577958-715d-4941-ad19-f7f8b1cf8586,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a577958-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.873 186993 DEBUG nova.objects.instance [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.888 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:28:34 compute-0 nova_compute[186989]:   <uuid>9d70dd0a-d1e7-4821-a30b-0f11f1440ae5</uuid>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   <name>instance-0000000b</name>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-73136063</nova:name>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:28:34</nova:creationTime>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:28:34 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:28:34 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:28:34 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:28:34 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:28:34 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:28:34 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:28:34 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:28:34 compute-0 nova_compute[186989]:         <nova:port uuid="4a577958-715d-4941-ad19-f7f8b1cf8586">
Dec 10 10:28:34 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <system>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <entry name="serial">9d70dd0a-d1e7-4821-a30b-0f11f1440ae5</entry>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <entry name="uuid">9d70dd0a-d1e7-4821-a30b-0f11f1440ae5</entry>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     </system>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   <os>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   </os>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   <features>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   </features>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.config"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:fb:ca:06"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <target dev="tap4a577958-71"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/console.log" append="off"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <video>
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     </video>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:28:34 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:28:34 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:28:34 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:28:34 compute-0 nova_compute[186989]: </domain>
Dec 10 10:28:34 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.890 186993 DEBUG nova.compute.manager [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Preparing to wait for external event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.891 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.891 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.892 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.892 186993 DEBUG nova.virt.libvirt.vif [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:28:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-73136063',display_name='tempest-TestNetworkBasicOps-server-73136063',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-73136063',id=11,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDBmKvYbT6bj4t57mDlrUSDmSjI3rzy8h8vceGx1Im7+MUXO41DcyBhm3Ct6eex1O+G5RLMqXn5sL8le7JyNPS5pNGQDvSbn/Ev+RDYQsNA6pKTEQis7SzRTupmQ5oaPFA==',key_name='tempest-TestNetworkBasicOps-1390815627',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-rzca3oo0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:28:27Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=9d70dd0a-d1e7-4821-a30b-0f11f1440ae5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.893 186993 DEBUG nova.network.os_vif_util [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.894 186993 DEBUG nova.network.os_vif_util [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fb:ca:06,bridge_name='br-int',has_traffic_filtering=True,id=4a577958-715d-4941-ad19-f7f8b1cf8586,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a577958-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.894 186993 DEBUG os_vif [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:ca:06,bridge_name='br-int',has_traffic_filtering=True,id=4a577958-715d-4941-ad19-f7f8b1cf8586,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a577958-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.895 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.895 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.896 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.900 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.900 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a577958-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.901 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4a577958-71, col_values=(('external_ids', {'iface-id': '4a577958-715d-4941-ad19-f7f8b1cf8586', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fb:ca:06', 'vm-uuid': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.903 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:34 compute-0 NetworkManager[55541]: <info>  [1765362514.9048] manager: (tap4a577958-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.905 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.912 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.912 186993 INFO os_vif [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fb:ca:06,bridge_name='br-int',has_traffic_filtering=True,id=4a577958-715d-4941-ad19-f7f8b1cf8586,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a577958-71')
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.969 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.970 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.970 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:fb:ca:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:28:34 compute-0 nova_compute[186989]: 2025-12-10 10:28:34.971 186993 INFO nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Using config drive
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.155 186993 INFO nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Creating config drive at /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.config
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.160 186993 DEBUG oslo_concurrency.processutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxnuw8r2z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.292 186993 DEBUG oslo_concurrency.processutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxnuw8r2z" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:36 compute-0 kernel: tap4a577958-71: entered promiscuous mode
Dec 10 10:28:36 compute-0 NetworkManager[55541]: <info>  [1765362516.3510] manager: (tap4a577958-71): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Dec 10 10:28:36 compute-0 ovn_controller[95452]: 2025-12-10T10:28:36Z|00139|binding|INFO|Claiming lport 4a577958-715d-4941-ad19-f7f8b1cf8586 for this chassis.
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.352 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:36 compute-0 ovn_controller[95452]: 2025-12-10T10:28:36Z|00140|binding|INFO|4a577958-715d-4941-ad19-f7f8b1cf8586: Claiming fa:16:3e:fb:ca:06 10.100.0.9
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.356 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.359 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.369 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:ca:06 10.100.0.9'], port_security=['fa:16:3e:fb:ca:06 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ddafa709-2a55-4d17-9bb4-cff67cccfae8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=868d1c18-09a1-433f-94c6-fe8a2c537be6, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=4a577958-715d-4941-ad19-f7f8b1cf8586) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.370 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 4a577958-715d-4941-ad19-f7f8b1cf8586 in datapath 88eae834-d1d3-4f81-a0f5-8439ceb543ad bound to our chassis
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.371 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88eae834-d1d3-4f81-a0f5-8439ceb543ad
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.383 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[467cdcd9-54a1-48a6-b6d6-ef2a968c73ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.384 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88eae834-d1 in ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.386 213247 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88eae834-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.386 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[20b67e7f-2ac3-4840-9bde-5e8d266aa71f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 systemd-udevd[218157]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.387 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[717b65e0-bea1-422e-b06c-0603abeece80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 systemd-machined[153379]: New machine qemu-11-instance-0000000b.
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.399 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[64c37a01-9c73-41fb-b050-1fcf21714c1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 NetworkManager[55541]: <info>  [1765362516.4048] device (tap4a577958-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:28:36 compute-0 NetworkManager[55541]: <info>  [1765362516.4056] device (tap4a577958-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.410 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.413 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[d1336698-2aef-4c19-bfcd-66a133be8a7a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 ovn_controller[95452]: 2025-12-10T10:28:36Z|00141|binding|INFO|Setting lport 4a577958-715d-4941-ad19-f7f8b1cf8586 ovn-installed in OVS
Dec 10 10:28:36 compute-0 ovn_controller[95452]: 2025-12-10T10:28:36Z|00142|binding|INFO|Setting lport 4a577958-715d-4941-ad19-f7f8b1cf8586 up in Southbound
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.417 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:36 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.437 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[76dd0ef8-c704-4eb9-b18e-44be3d62ec37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.441 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[12f59f0c-7282-43a7-a47d-ea960988170a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 NetworkManager[55541]: <info>  [1765362516.4420] manager: (tap88eae834-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.471 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[dad3c94b-b871-4bee-b0e0-74cb0e4a7a8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.475 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e8ddda-fd40-485e-99ad-9314a273fc4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 NetworkManager[55541]: <info>  [1765362516.5035] device (tap88eae834-d0): carrier: link connected
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.511 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[c790a75d-dc4a-4a1f-a8b7-8e062b7b695b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.540 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[3d243780-bdb1-4906-b27d-4deb85d68ccf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88eae834-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:41:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353172, 'reachable_time': 43673, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218190, 'error': None, 'target': 'ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.562 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[efd09c60-d103-4fe0-ba62-780e2bf3352d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feeb:41db'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353172, 'tstamp': 353172}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218191, 'error': None, 'target': 'ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.582 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[6b85f445-6dae-4a7c-b9ee-1eb967de0dd2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88eae834-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:41:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353172, 'reachable_time': 43673, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218192, 'error': None, 'target': 'ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.614 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[63452d9a-219d-434c-9d53-3c8f5df1f859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.672 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[53f893dc-a25c-495b-9df5-deaa1b398daa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.674 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88eae834-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.674 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.675 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88eae834-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:36 compute-0 NetworkManager[55541]: <info>  [1765362516.6777] manager: (tap88eae834-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Dec 10 10:28:36 compute-0 kernel: tap88eae834-d0: entered promiscuous mode
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.678 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.680 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88eae834-d0, col_values=(('external_ids', {'iface-id': '638b8493-172c-4908-95a3-17b633db5334'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.681 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:36 compute-0 ovn_controller[95452]: 2025-12-10T10:28:36Z|00143|binding|INFO|Releasing lport 638b8493-172c-4908-95a3-17b633db5334 from this chassis (sb_readonly=0)
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.681 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.684 104302 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88eae834-d1d3-4f81-a0f5-8439ceb543ad.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88eae834-d1d3-4f81-a0f5-8439ceb543ad.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.685 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc1547c-212a-4f4a-acea-374036d62a03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.685 104302 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: global
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     log         /dev/log local0 debug
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     log-tag     haproxy-metadata-proxy-88eae834-d1d3-4f81-a0f5-8439ceb543ad
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     user        root
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     group       root
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     maxconn     1024
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     pidfile     /var/lib/neutron/external/pids/88eae834-d1d3-4f81-a0f5-8439ceb543ad.pid.haproxy
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     daemon
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: defaults
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     log global
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     mode http
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     option httplog
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     option dontlognull
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     option http-server-close
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     option forwardfor
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     retries                 3
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     timeout http-request    30s
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     timeout connect         30s
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     timeout client          32s
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     timeout server          32s
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     timeout http-keep-alive 30s
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: listen listener
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     bind 169.254.169.254:80
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     server metadata /var/lib/neutron/metadata_proxy
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:     http-request add-header X-OVN-Network-ID 88eae834-d1d3-4f81-a0f5-8439ceb543ad
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 10 10:28:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:36.687 104302 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'env', 'PROCESS_TAG=haproxy-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88eae834-d1d3-4f81-a0f5-8439ceb543ad.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.692 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.890 186993 DEBUG nova.compute.manager [req-f55e226b-db29-4ef9-85b9-95d53d149f61 req-a3278c29-c313-4cce-8cbd-4512b6d3f208 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.891 186993 DEBUG oslo_concurrency.lockutils [req-f55e226b-db29-4ef9-85b9-95d53d149f61 req-a3278c29-c313-4cce-8cbd-4512b6d3f208 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.891 186993 DEBUG oslo_concurrency.lockutils [req-f55e226b-db29-4ef9-85b9-95d53d149f61 req-a3278c29-c313-4cce-8cbd-4512b6d3f208 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.892 186993 DEBUG oslo_concurrency.lockutils [req-f55e226b-db29-4ef9-85b9-95d53d149f61 req-a3278c29-c313-4cce-8cbd-4512b6d3f208 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.892 186993 DEBUG nova.compute.manager [req-f55e226b-db29-4ef9-85b9-95d53d149f61 req-a3278c29-c313-4cce-8cbd-4512b6d3f208 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Processing event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.978 186993 DEBUG nova.network.neutron [req-ad4371e1-730c-468e-b6c0-b2063b50c476 req-8af1d165-1169-4533-ac1a-0641fce14567 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updated VIF entry in instance network info cache for port 4a577958-715d-4941-ad19-f7f8b1cf8586. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.980 186993 DEBUG nova.network.neutron [req-ad4371e1-730c-468e-b6c0-b2063b50c476 req-8af1d165-1169-4533-ac1a-0641fce14567 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updating instance_info_cache with network_info: [{"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.993 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362516.9925108, 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.993 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] VM Started (Lifecycle Event)
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.995 186993 DEBUG nova.compute.manager [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.997 186993 DEBUG oslo_concurrency.lockutils [req-ad4371e1-730c-468e-b6c0-b2063b50c476 req-8af1d165-1169-4533-ac1a-0641fce14567 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:28:36 compute-0 nova_compute[186989]: 2025-12-10 10:28:36.998 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.003 186993 INFO nova.virt.libvirt.driver [-] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Instance spawned successfully.
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.003 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.010 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.014 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.024 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.025 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.025 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.026 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.026 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.026 186993 DEBUG nova.virt.libvirt.driver [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.031 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.032 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362516.9935026, 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.032 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] VM Paused (Lifecycle Event)
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.070 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.074 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362516.9981568, 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.074 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] VM Resumed (Lifecycle Event)
Dec 10 10:28:37 compute-0 podman[218231]: 2025-12-10 10:28:37.08806991 +0000 UTC m=+0.060591164 container create 47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.113 186993 INFO nova.compute.manager [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Took 9.41 seconds to spawn the instance on the hypervisor.
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.113 186993 DEBUG nova.compute.manager [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.117 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.125 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:28:37 compute-0 systemd[1]: Started libpod-conmon-47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e.scope.
Dec 10 10:28:37 compute-0 podman[218231]: 2025-12-10 10:28:37.053595808 +0000 UTC m=+0.026117152 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.153 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:28:37 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:28:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/502738898ef3109e58551bffd974a0461136268b3e97951cd24fa05ee086853f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:28:37 compute-0 podman[218231]: 2025-12-10 10:28:37.195036032 +0000 UTC m=+0.167557286 container init 47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 10 10:28:37 compute-0 podman[218231]: 2025-12-10 10:28:37.204921305 +0000 UTC m=+0.177442559 container start 47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.207 186993 INFO nova.compute.manager [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Took 9.89 seconds to build instance.
Dec 10 10:28:37 compute-0 neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad[218246]: [NOTICE]   (218250) : New worker (218252) forked
Dec 10 10:28:37 compute-0 neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad[218246]: [NOTICE]   (218250) : Loading success.
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.227 186993 DEBUG oslo_concurrency.lockutils [None req-ac77c6bb-bb63-41f0-be94-8c8a6537ba3f 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:37 compute-0 nova_compute[186989]: 2025-12-10 10:28:37.847 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:39 compute-0 nova_compute[186989]: 2025-12-10 10:28:39.093 186993 DEBUG nova.compute.manager [req-f5603e4a-7f65-41db-ace0-86e801652726 req-fb37014d-df98-43b9-ba39-6ee8a827c886 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:28:39 compute-0 nova_compute[186989]: 2025-12-10 10:28:39.094 186993 DEBUG oslo_concurrency.lockutils [req-f5603e4a-7f65-41db-ace0-86e801652726 req-fb37014d-df98-43b9-ba39-6ee8a827c886 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:39 compute-0 nova_compute[186989]: 2025-12-10 10:28:39.094 186993 DEBUG oslo_concurrency.lockutils [req-f5603e4a-7f65-41db-ace0-86e801652726 req-fb37014d-df98-43b9-ba39-6ee8a827c886 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:39 compute-0 nova_compute[186989]: 2025-12-10 10:28:39.095 186993 DEBUG oslo_concurrency.lockutils [req-f5603e4a-7f65-41db-ace0-86e801652726 req-fb37014d-df98-43b9-ba39-6ee8a827c886 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:39 compute-0 nova_compute[186989]: 2025-12-10 10:28:39.095 186993 DEBUG nova.compute.manager [req-f5603e4a-7f65-41db-ace0-86e801652726 req-fb37014d-df98-43b9-ba39-6ee8a827c886 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] No waiting events found dispatching network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:28:39 compute-0 nova_compute[186989]: 2025-12-10 10:28:39.096 186993 WARNING nova.compute.manager [req-f5603e4a-7f65-41db-ace0-86e801652726 req-fb37014d-df98-43b9-ba39-6ee8a827c886 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received unexpected event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 for instance with vm_state active and task_state None.
Dec 10 10:28:39 compute-0 nova_compute[186989]: 2025-12-10 10:28:39.906 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:40 compute-0 nova_compute[186989]: 2025-12-10 10:28:40.767 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:40 compute-0 NetworkManager[55541]: <info>  [1765362520.7701] manager: (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Dec 10 10:28:40 compute-0 ovn_controller[95452]: 2025-12-10T10:28:40Z|00144|binding|INFO|Releasing lport 638b8493-172c-4908-95a3-17b633db5334 from this chassis (sb_readonly=0)
Dec 10 10:28:40 compute-0 NetworkManager[55541]: <info>  [1765362520.7736] manager: (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Dec 10 10:28:40 compute-0 nova_compute[186989]: 2025-12-10 10:28:40.796 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:40 compute-0 ovn_controller[95452]: 2025-12-10T10:28:40Z|00145|binding|INFO|Releasing lport 638b8493-172c-4908-95a3-17b633db5334 from this chassis (sb_readonly=0)
Dec 10 10:28:40 compute-0 nova_compute[186989]: 2025-12-10 10:28:40.803 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:42 compute-0 nova_compute[186989]: 2025-12-10 10:28:42.852 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:43 compute-0 podman[218262]: 2025-12-10 10:28:43.097161236 +0000 UTC m=+0.121287479 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 10 10:28:44 compute-0 nova_compute[186989]: 2025-12-10 10:28:44.049 186993 DEBUG nova.compute.manager [req-76554266-77aa-4653-a186-2abed082f6bb req-8db22fc3-894f-462c-8746-a077def29ee8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-changed-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:28:44 compute-0 nova_compute[186989]: 2025-12-10 10:28:44.050 186993 DEBUG nova.compute.manager [req-76554266-77aa-4653-a186-2abed082f6bb req-8db22fc3-894f-462c-8746-a077def29ee8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Refreshing instance network info cache due to event network-changed-4a577958-715d-4941-ad19-f7f8b1cf8586. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:28:44 compute-0 nova_compute[186989]: 2025-12-10 10:28:44.050 186993 DEBUG oslo_concurrency.lockutils [req-76554266-77aa-4653-a186-2abed082f6bb req-8db22fc3-894f-462c-8746-a077def29ee8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:28:44 compute-0 nova_compute[186989]: 2025-12-10 10:28:44.050 186993 DEBUG oslo_concurrency.lockutils [req-76554266-77aa-4653-a186-2abed082f6bb req-8db22fc3-894f-462c-8746-a077def29ee8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:28:44 compute-0 nova_compute[186989]: 2025-12-10 10:28:44.050 186993 DEBUG nova.network.neutron [req-76554266-77aa-4653-a186-2abed082f6bb req-8db22fc3-894f-462c-8746-a077def29ee8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Refreshing network info cache for port 4a577958-715d-4941-ad19-f7f8b1cf8586 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:28:44 compute-0 nova_compute[186989]: 2025-12-10 10:28:44.909 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:45 compute-0 podman[218281]: 2025-12-10 10:28:45.052554989 +0000 UTC m=+0.082776535 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.432 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'name': 'tempest-TestNetworkBasicOps-server-73136063', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '82da19f85bb840d2a70395c3d761ef38', 'user_id': '603f9c3a99e145e4a64248329321a249', 'hostId': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.433 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.437 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5 / tap4a577958-71 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.438 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7be87ba4-defc-42d4-bd59-6c3aec1890d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-0000000b-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-tap4a577958-71', 'timestamp': '2025-12-10T10:28:45.434206', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'tap4a577958-71', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:ca:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a577958-71'}, 'message_id': '01ca6e0e-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.707620974, 'message_signature': '699ba074d8b4b00571a4116fed18508a57819e1b798bfbbf5e65ddc0dbf19788'}]}, 'timestamp': '2025-12-10 10:28:45.439574', '_unique_id': '88e7d0da7b394cebb3613b7663de849d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.442 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.444 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.444 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '186cafdd-f38c-49b6-b069-7eaa39764f2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-0000000b-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-tap4a577958-71', 'timestamp': '2025-12-10T10:28:45.444889', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'tap4a577958-71', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:ca:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a577958-71'}, 'message_id': '01cb5f3a-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.707620974, 'message_signature': 'cd4a0be9f6d1f1fa711ded859378fbb0b45c77f526bc769ac9ea4e1720be6b51'}]}, 'timestamp': '2025-12-10 10:28:45.445764', '_unique_id': '4e2464f6f6ce42c3b0fc2d2d0443c561'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.447 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.449 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.498 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.499 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ae428af-a9bf-4160-8f51-94396db5dfcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-vda', 'timestamp': '2025-12-10T10:28:45.449309', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01d39af6-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.72270023, 'message_signature': 'a3b2038f8db0164f770bd69f7b201f1b06a9314b73f38ed17cbc0ac4c2b848bb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-sda', 'timestamp': '2025-12-10T10:28:45.449309', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01d3ba4a-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.72270023, 'message_signature': '9a56c85b919af07f6c093dac239e6cc83e43ff8e203e03f6b3682dfb55ec3fc0'}]}, 'timestamp': '2025-12-10 10:28:45.500313', '_unique_id': 'e7252d43ead7430ea5ec8e8eea00a2fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.502 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.504 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.504 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd35dc014-1cca-4228-92b4-67b19bf34c90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-0000000b-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-tap4a577958-71', 'timestamp': '2025-12-10T10:28:45.504462', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'tap4a577958-71', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:ca:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a577958-71'}, 'message_id': '01d474da-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.707620974, 'message_signature': 'c9d22413dd5b4e9ad5b13af95e7d34a7b96d3e50da26e527658394cda8c5de16'}]}, 'timestamp': '2025-12-10 10:28:45.505124', '_unique_id': 'bce1751b49854ab89eff9aac0f530bb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.506 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.507 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.508 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '216593f7-6223-4039-94d4-d03aeaf7f527', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-0000000b-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-tap4a577958-71', 'timestamp': '2025-12-10T10:28:45.507967', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'tap4a577958-71', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:ca:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a577958-71'}, 'message_id': '01d4fcac-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.707620974, 'message_signature': 'aa063488481db789d5598d48c53b527f50bcd7234fea28ed4c876d7dec16cae8'}]}, 'timestamp': '2025-12-10 10:28:45.508658', '_unique_id': '7b8be65b741f496a9cdf10564377ba8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.509 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.510 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.511 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.511 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-73136063>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-73136063>]
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.511 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.527 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.527 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '779efe96-9819-439c-93f2-371de00bd422', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-vda', 'timestamp': '2025-12-10T10:28:45.511892', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01d7eb6a-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.785274136, 'message_signature': 'bb0eadb127bbfe103d13a0a3ec9a475e176935a0c09ebdd4aee9d51e6869de77'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-sda', 'timestamp': '2025-12-10T10:28:45.511892', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01d80064-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.785274136, 'message_signature': '16473026b4207ea2ce1286b7072b54ab8cab43d782957fa812a8ca3898ebdcea'}]}, 'timestamp': '2025-12-10 10:28:45.528302', '_unique_id': 'b4c7ad03c71848c2a56a0a847968a548'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.529 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.530 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.531 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.531 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-73136063>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-73136063>]
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.531 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.531 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.531 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-73136063>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-73136063>]
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.532 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.551 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.551 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5: ceilometer.compute.pollsters.NoVolumeException
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.551 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.551 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.552 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-73136063>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-73136063>]
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.552 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.552 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.553 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cf04014-8ffa-456d-8cae-1018c70c3875', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-vda', 'timestamp': '2025-12-10T10:28:45.552567', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01dbcc3a-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.72270023, 'message_signature': 'f08feb659d12294643d5afd31daf995d35ccc41c082bea72c789ad28d9fb4058'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-sda', 'timestamp': '2025-12-10T10:28:45.552567', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01dbe5f8-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.72270023, 'message_signature': '2fd61c78844fcba461b8ded9b8106bc95c8423b23ac626d7baca1f038447b79a'}]}, 'timestamp': '2025-12-10 10:28:45.553872', '_unique_id': 'a4172ed2988442e5971828cb94930e28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.554 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.556 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.556 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99b94964-0d25-47bc-a291-0bdbde970c5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-0000000b-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-tap4a577958-71', 'timestamp': '2025-12-10T10:28:45.556322', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'tap4a577958-71', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:ca:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a577958-71'}, 'message_id': '01dc5e48-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.707620974, 'message_signature': '9ad4c448d5ed8d9ec1109635aee2c868a918b6ec11bf3e0f02ed3dc567f09143'}]}, 'timestamp': '2025-12-10 10:28:45.556972', '_unique_id': '11fd4f41d2d94336ad5f6443823b2b86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.558 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.559 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.559 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/cpu volume: 8170000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '871b54aa-227d-4f25-ab4a-7b46a71238bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8170000000, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'timestamp': '2025-12-10T10:28:45.559432', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '01dcd760-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.824436337, 'message_signature': '30820601b274878b44a4acc6b6f967eca69ea1843faf21bc057917f0313355f8'}]}, 'timestamp': '2025-12-10 10:28:45.560026', '_unique_id': 'e8fc04b9c67d4ae9bf5ee83db2832344'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.561 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.562 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.562 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.562 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee5ac1ea-f31e-482b-9026-4d41375228c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-vda', 'timestamp': '2025-12-10T10:28:45.562376', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01dd45ce-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.72270023, 'message_signature': '106b5c2b955caf1160a93ccf30d0ec849a9518acd493298277dd940004d48f9d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-sda', 'timestamp': '2025-12-10T10:28:45.562376', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01dd5898-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.72270023, 'message_signature': 'ad43848f3d3ee6372053d5f0e934d8904dc86cf89c1b9746a037c4b3d14a3a28'}]}, 'timestamp': '2025-12-10 10:28:45.563307', '_unique_id': '8c151097b9ec428b806a446cd0ce314a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.564 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.565 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.565 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea7407b0-6f5b-4108-b25c-2d8918d805f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-0000000b-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-tap4a577958-71', 'timestamp': '2025-12-10T10:28:45.565681', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'tap4a577958-71', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:ca:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a577958-71'}, 'message_id': '01ddc97c-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.707620974, 'message_signature': '4e9755917ca7938ea09a5bd14a10d2aa568435a1ef05c978f95d8105b4c236e0'}]}, 'timestamp': '2025-12-10 10:28:45.566228', '_unique_id': '4210a33e2a874f15a97b308f6b7f7426'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.567 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.568 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.568 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.569 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b8f3190-b8a5-4aa2-a20b-4578d14b25cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-vda', 'timestamp': '2025-12-10T10:28:45.568531', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01de377c-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.785274136, 'message_signature': '0965fdbba8ce746d01052d7b90c48a0bb564a5d0a49b625a575f90fb528ab3f7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-sda', 'timestamp': '2025-12-10T10:28:45.568531', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01de4d7a-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.785274136, 'message_signature': '347da2693121d89b8c8dec167aba787ff8ce714333d9d459ea54fd1683582200'}]}, 'timestamp': '2025-12-10 10:28:45.569705', '_unique_id': 'c435bc376a17425ba6ce64b1fb3c0ea1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.570 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.572 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.572 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc98db49-94b7-44da-add3-355c5631526c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-0000000b-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-tap4a577958-71', 'timestamp': '2025-12-10T10:28:45.572559', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'tap4a577958-71', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:ca:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a577958-71'}, 'message_id': '01ded7d6-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.707620974, 'message_signature': '8618acbefda4ab0d379d7325ec55fb8026b92d5c350fac09d46c688e9ac30a83'}]}, 'timestamp': '2025-12-10 10:28:45.573220', '_unique_id': '676534bc02d04c88950d8436b3c9d746'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.574 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.576 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.576 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81b94c21-9a8a-4bcf-9424-fbfe23dc714c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-0000000b-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-tap4a577958-71', 'timestamp': '2025-12-10T10:28:45.576257', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'tap4a577958-71', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:ca:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a577958-71'}, 'message_id': '01df67e6-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.707620974, 'message_signature': '78adfcbe0d0e2d8daebd21aa1d757ca8048276593a987e6cc1395cb58a89bafb'}]}, 'timestamp': '2025-12-10 10:28:45.576902', '_unique_id': '601b2b9403e7417b8133599e04261375'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.578 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.579 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.579 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ae417b0-d53d-49da-97cf-2087bb404e86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-0000000b-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-tap4a577958-71', 'timestamp': '2025-12-10T10:28:45.579620', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'tap4a577958-71', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:ca:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a577958-71'}, 'message_id': '01dfee32-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.707620974, 'message_signature': 'd678f41ad6dbce8353d7ae07f1fe7dd82e769cf3e9833a248e7133b666dafc0c'}]}, 'timestamp': '2025-12-10 10:28:45.580371', '_unique_id': 'c867e58930cc4adab7b92487b3fb4546'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.581 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.583 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.583 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.583 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '303fda77-24b5-404e-8d9c-ba9057d2111d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-vda', 'timestamp': '2025-12-10T10:28:45.583172', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01e07352-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.785274136, 'message_signature': '020f7dcc73d704a572eaf6200097d47d9c70d38766bba2d4441ed75fa316b192'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-sda', 'timestamp': '2025-12-10T10:28:45.583172', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01e08a2c-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.785274136, 'message_signature': '324761b43bd99db75a8449a7c0c047b98075bbe42a7487104a4894651c8b4c16'}]}, 'timestamp': '2025-12-10 10:28:45.584338', '_unique_id': '3a420822b7e4411b88d10e0cfab4f61c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.585 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.586 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.587 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53dec124-8e3c-42cd-b63b-a4fbc27731bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 'instance-0000000b-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-tap4a577958-71', 'timestamp': '2025-12-10T10:28:45.587037', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'tap4a577958-71', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fb:ca:06', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4a577958-71'}, 'message_id': '01e10844-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.707620974, 'message_signature': 'f1b91c091963f2ea679ee353efef1b6871be2c7ab99ef7811f1c1e6ae457a700'}]}, 'timestamp': '2025-12-10 10:28:45.587471', '_unique_id': 'e3f6d210175b4a3e846ad2bf73a0e395'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.588 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.589 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.589 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.589 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8cf50fe-ca15-413d-9ebf-a7d297b968ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-vda', 'timestamp': '2025-12-10T10:28:45.589505', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01e16988-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.72270023, 'message_signature': 'a6f6ca1677c56735d5012f1c07f4d22712cd3311909fb3e035a24978662653ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-sda', 'timestamp': '2025-12-10T10:28:45.589505', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01e17720-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.72270023, 'message_signature': '525b3a5664413b72ff99c7ebfdf00280c2434fbb1f3ae04d2344edb5c4bf5176'}]}, 'timestamp': '2025-12-10 10:28:45.590239', '_unique_id': 'ab0b571fc9f44e928cadd5cada6ce679'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.590 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.591 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.591 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.read.latency volume: 149716911 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.592 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.read.latency volume: 849163 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ddbabeb-9126-414f-a6ef-6cc181e99382', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 149716911, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-vda', 'timestamp': '2025-12-10T10:28:45.591786', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01e1c018-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.72270023, 'message_signature': 'f8a7df7e7aefafe5f0f3f5817adc24e51fc06c9e55bcb79dfafd20661630e610'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 849163, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 
'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-sda', 'timestamp': '2025-12-10T10:28:45.591786', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01e1cc48-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.72270023, 'message_signature': '6ada568d3c19b17edb478ab8d369d74f027c62548965737c206b05ff6dac6479'}]}, 'timestamp': '2025-12-10 10:28:45.592410', '_unique_id': '59f3122d98174b048cf61ddc571b273c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.593 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.594 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.594 12 DEBUG ceilometer.compute.pollsters [-] 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d1bb3d2-2fae-4bed-b6f0-4b3e5489a662', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-vda', 'timestamp': '2025-12-10T10:28:45.594021', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01e217c0-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.72270023, 'message_signature': '8766c2359355a113053c7271ff1b180bc714cf29f24f366d2af6288b40d58329'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_name': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_name': None, 'resource_id': 
'9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-sda', 'timestamp': '2025-12-10T10:28:45.594021', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-73136063', 'name': 'instance-0000000b', 'instance_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'instance_type': 'm1.nano', 'host': 'ec8a1fa356a1608901604d711aae98fa22bf1c8712e20c4d861445b0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6f9bf686-c5d3-4e9c-a944-269864569e67', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}, 'image_ref': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01e22616-d5b3-11f0-b3c8-fa163e2093b1', 'monotonic_time': 3540.72270023, 'message_signature': 'b7d91fe639e0f0319121555e926a7dd190d743d164d419ebcb856087d86968ee'}]}, 'timestamp': '2025-12-10 10:28:45.594793', '_unique_id': 'cfe6d96ada0144a498c2a90e591c5926'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     yield
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 10 10:28:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:28:45.595 12 ERROR oslo_messaging.notify.messaging 
Dec 10 10:28:45 compute-0 nova_compute[186989]: 2025-12-10 10:28:45.911 186993 DEBUG nova.network.neutron [req-76554266-77aa-4653-a186-2abed082f6bb req-8db22fc3-894f-462c-8746-a077def29ee8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updated VIF entry in instance network info cache for port 4a577958-715d-4941-ad19-f7f8b1cf8586. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:28:45 compute-0 nova_compute[186989]: 2025-12-10 10:28:45.912 186993 DEBUG nova.network.neutron [req-76554266-77aa-4653-a186-2abed082f6bb req-8db22fc3-894f-462c-8746-a077def29ee8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updating instance_info_cache with network_info: [{"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:28:45 compute-0 nova_compute[186989]: 2025-12-10 10:28:45.938 186993 DEBUG oslo_concurrency.lockutils [req-76554266-77aa-4653-a186-2abed082f6bb req-8db22fc3-894f-462c-8746-a077def29ee8 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:28:47 compute-0 nova_compute[186989]: 2025-12-10 10:28:47.857 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:48 compute-0 ovn_controller[95452]: 2025-12-10T10:28:48Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fb:ca:06 10.100.0.9
Dec 10 10:28:48 compute-0 ovn_controller[95452]: 2025-12-10T10:28:48Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fb:ca:06 10.100.0.9
Dec 10 10:28:48 compute-0 nova_compute[186989]: 2025-12-10 10:28:48.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:28:49 compute-0 nova_compute[186989]: 2025-12-10 10:28:49.918 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:51 compute-0 nova_compute[186989]: 2025-12-10 10:28:51.899 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "4756e517-16f2-43b0-809d-2464cbd9e219" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:51 compute-0 nova_compute[186989]: 2025-12-10 10:28:51.900 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:51 compute-0 nova_compute[186989]: 2025-12-10 10:28:51.949 186993 DEBUG nova.compute.manager [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.076 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.077 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.101 186993 DEBUG nova.virt.hardware [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.102 186993 INFO nova.compute.claims [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.258 186993 DEBUG nova.compute.provider_tree [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.280 186993 DEBUG nova.scheduler.client.report [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.352 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.354 186993 DEBUG nova.compute.manager [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.475 186993 DEBUG nova.compute.manager [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.475 186993 DEBUG nova.network.neutron [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.521 186993 INFO nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.544 186993 DEBUG nova.compute.manager [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.677 186993 DEBUG nova.compute.manager [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.678 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.678 186993 INFO nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Creating image(s)
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.680 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.680 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.681 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.693 186993 DEBUG oslo_concurrency.processutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.748 186993 DEBUG oslo_concurrency.processutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.750 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.751 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.774 186993 DEBUG oslo_concurrency.processutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.827 186993 DEBUG oslo_concurrency.processutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.828 186993 DEBUG oslo_concurrency.processutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.859 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.883 186993 DEBUG oslo_concurrency.processutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.884 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.884 186993 DEBUG oslo_concurrency.processutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.967 186993 DEBUG oslo_concurrency.processutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.968 186993 DEBUG nova.virt.disk.api [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:28:52 compute-0 nova_compute[186989]: 2025-12-10 10:28:52.969 186993 DEBUG oslo_concurrency.processutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:53 compute-0 nova_compute[186989]: 2025-12-10 10:28:53.033 186993 DEBUG oslo_concurrency.processutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:53 compute-0 nova_compute[186989]: 2025-12-10 10:28:53.034 186993 DEBUG nova.virt.disk.api [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:28:53 compute-0 nova_compute[186989]: 2025-12-10 10:28:53.035 186993 DEBUG nova.objects.instance [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid 4756e517-16f2-43b0-809d-2464cbd9e219 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:28:53 compute-0 nova_compute[186989]: 2025-12-10 10:28:53.095 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:28:53 compute-0 nova_compute[186989]: 2025-12-10 10:28:53.095 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Ensure instance console log exists: /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:28:53 compute-0 nova_compute[186989]: 2025-12-10 10:28:53.096 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:53 compute-0 nova_compute[186989]: 2025-12-10 10:28:53.096 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:53 compute-0 nova_compute[186989]: 2025-12-10 10:28:53.096 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:53 compute-0 nova_compute[186989]: 2025-12-10 10:28:53.602 186993 DEBUG nova.policy [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:28:53 compute-0 nova_compute[186989]: 2025-12-10 10:28:53.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:28:54 compute-0 nova_compute[186989]: 2025-12-10 10:28:54.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:28:54 compute-0 nova_compute[186989]: 2025-12-10 10:28:54.957 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:55 compute-0 nova_compute[186989]: 2025-12-10 10:28:55.089 186993 DEBUG nova.network.neutron [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Successfully created port: b770b01f-4978-4c34-b9c6-d512018a5dc3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:28:55 compute-0 nova_compute[186989]: 2025-12-10 10:28:55.693 186993 DEBUG nova.network.neutron [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Successfully updated port: b770b01f-4978-4c34-b9c6-d512018a5dc3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:28:55 compute-0 nova_compute[186989]: 2025-12-10 10:28:55.715 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-4756e517-16f2-43b0-809d-2464cbd9e219" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:28:55 compute-0 nova_compute[186989]: 2025-12-10 10:28:55.715 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-4756e517-16f2-43b0-809d-2464cbd9e219" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:28:55 compute-0 nova_compute[186989]: 2025-12-10 10:28:55.716 186993 DEBUG nova.network.neutron [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:28:55 compute-0 nova_compute[186989]: 2025-12-10 10:28:55.779 186993 DEBUG nova.compute.manager [req-0613f8a3-35b0-49da-a938-0c5fd671ad85 req-524e1c5c-5b49-4c8f-a9cc-d556d6880203 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Received event network-changed-b770b01f-4978-4c34-b9c6-d512018a5dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:28:55 compute-0 nova_compute[186989]: 2025-12-10 10:28:55.779 186993 DEBUG nova.compute.manager [req-0613f8a3-35b0-49da-a938-0c5fd671ad85 req-524e1c5c-5b49-4c8f-a9cc-d556d6880203 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Refreshing instance network info cache due to event network-changed-b770b01f-4978-4c34-b9c6-d512018a5dc3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:28:55 compute-0 nova_compute[186989]: 2025-12-10 10:28:55.780 186993 DEBUG oslo_concurrency.lockutils [req-0613f8a3-35b0-49da-a938-0c5fd671ad85 req-524e1c5c-5b49-4c8f-a9cc-d556d6880203 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-4756e517-16f2-43b0-809d-2464cbd9e219" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:28:55 compute-0 nova_compute[186989]: 2025-12-10 10:28:55.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:28:55 compute-0 nova_compute[186989]: 2025-12-10 10:28:55.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:28:56 compute-0 nova_compute[186989]: 2025-12-10 10:28:56.547 186993 DEBUG nova.network.neutron [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:28:56 compute-0 nova_compute[186989]: 2025-12-10 10:28:56.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:28:56 compute-0 nova_compute[186989]: 2025-12-10 10:28:56.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:28:56 compute-0 nova_compute[186989]: 2025-12-10 10:28:56.923 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:28:56 compute-0 nova_compute[186989]: 2025-12-10 10:28:56.950 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 10 10:28:57 compute-0 podman[218336]: 2025-12-10 10:28:57.048041106 +0000 UTC m=+0.079663568 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.554 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.554 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquired lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.554 186993 DEBUG nova.network.neutron [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.555 186993 DEBUG nova.objects.instance [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.862 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.882 186993 DEBUG nova.network.neutron [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Updating instance_info_cache with network_info: [{"id": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "address": "fa:16:3e:31:00:2a", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb770b01f-49", "ovs_interfaceid": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.963 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-4756e517-16f2-43b0-809d-2464cbd9e219" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.963 186993 DEBUG nova.compute.manager [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Instance network_info: |[{"id": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "address": "fa:16:3e:31:00:2a", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb770b01f-49", "ovs_interfaceid": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.964 186993 DEBUG oslo_concurrency.lockutils [req-0613f8a3-35b0-49da-a938-0c5fd671ad85 req-524e1c5c-5b49-4c8f-a9cc-d556d6880203 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-4756e517-16f2-43b0-809d-2464cbd9e219" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.965 186993 DEBUG nova.network.neutron [req-0613f8a3-35b0-49da-a938-0c5fd671ad85 req-524e1c5c-5b49-4c8f-a9cc-d556d6880203 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Refreshing network info cache for port b770b01f-4978-4c34-b9c6-d512018a5dc3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.971 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Start _get_guest_xml network_info=[{"id": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "address": "fa:16:3e:31:00:2a", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb770b01f-49", "ovs_interfaceid": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.977 186993 WARNING nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.984 186993 DEBUG nova.virt.libvirt.host [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.985 186993 DEBUG nova.virt.libvirt.host [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.993 186993 DEBUG nova.virt.libvirt.host [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.994 186993 DEBUG nova.virt.libvirt.host [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.994 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.995 186993 DEBUG nova.virt.hardware [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.995 186993 DEBUG nova.virt.hardware [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.996 186993 DEBUG nova.virt.hardware [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.996 186993 DEBUG nova.virt.hardware [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.996 186993 DEBUG nova.virt.hardware [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.997 186993 DEBUG nova.virt.hardware [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.997 186993 DEBUG nova.virt.hardware [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.997 186993 DEBUG nova.virt.hardware [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.998 186993 DEBUG nova.virt.hardware [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.998 186993 DEBUG nova.virt.hardware [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:28:57 compute-0 nova_compute[186989]: 2025-12-10 10:28:57.998 186993 DEBUG nova.virt.hardware [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.003 186993 DEBUG nova.virt.libvirt.vif [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:28:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-165096650',display_name='tempest-TestNetworkBasicOps-server-165096650',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-165096650',id=12,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAAj2H/3GzC0Ce7KCOnImnvh4ajteZWzrvKUT9mLfsL+YWJqIYjPMlma/vIkHFmwRVulR4YUwjjrmyih6KyZ2A7hIjhdLZE7AD/EHJJn8DSVj8AGGdL1krcc6f7OAU/hvA==',key_name='tempest-TestNetworkBasicOps-1984468410',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-0n1sppf9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:28:52Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=4756e517-16f2-43b0-809d-2464cbd9e219,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "address": "fa:16:3e:31:00:2a", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb770b01f-49", "ovs_interfaceid": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.003 186993 DEBUG nova.network.os_vif_util [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "address": "fa:16:3e:31:00:2a", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb770b01f-49", "ovs_interfaceid": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.004 186993 DEBUG nova.network.os_vif_util [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=b770b01f-4978-4c34-b9c6-d512018a5dc3,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb770b01f-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.005 186993 DEBUG nova.objects.instance [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4756e517-16f2-43b0-809d-2464cbd9e219 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.025 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:28:58 compute-0 nova_compute[186989]:   <uuid>4756e517-16f2-43b0-809d-2464cbd9e219</uuid>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   <name>instance-0000000c</name>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-165096650</nova:name>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:28:57</nova:creationTime>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:28:58 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:28:58 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:28:58 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:28:58 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:28:58 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:28:58 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:28:58 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:28:58 compute-0 nova_compute[186989]:         <nova:port uuid="b770b01f-4978-4c34-b9c6-d512018a5dc3">
Dec 10 10:28:58 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <system>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <entry name="serial">4756e517-16f2-43b0-809d-2464cbd9e219</entry>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <entry name="uuid">4756e517-16f2-43b0-809d-2464cbd9e219</entry>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     </system>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   <os>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   </os>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   <features>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   </features>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk.config"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:31:00:2a"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <target dev="tapb770b01f-49"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/console.log" append="off"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <video>
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     </video>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:28:58 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:28:58 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:28:58 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:28:58 compute-0 nova_compute[186989]: </domain>
Dec 10 10:28:58 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.027 186993 DEBUG nova.compute.manager [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Preparing to wait for external event network-vif-plugged-b770b01f-4978-4c34-b9c6-d512018a5dc3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.027 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.028 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.028 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.029 186993 DEBUG nova.virt.libvirt.vif [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:28:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-165096650',display_name='tempest-TestNetworkBasicOps-server-165096650',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-165096650',id=12,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAAj2H/3GzC0Ce7KCOnImnvh4ajteZWzrvKUT9mLfsL+YWJqIYjPMlma/vIkHFmwRVulR4YUwjjrmyih6KyZ2A7hIjhdLZE7AD/EHJJn8DSVj8AGGdL1krcc6f7OAU/hvA==',key_name='tempest-TestNetworkBasicOps-1984468410',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-0n1sppf9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:28:52Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=4756e517-16f2-43b0-809d-2464cbd9e219,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "address": "fa:16:3e:31:00:2a", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb770b01f-49", "ovs_interfaceid": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.029 186993 DEBUG nova.network.os_vif_util [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "address": "fa:16:3e:31:00:2a", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb770b01f-49", "ovs_interfaceid": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.030 186993 DEBUG nova.network.os_vif_util [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=b770b01f-4978-4c34-b9c6-d512018a5dc3,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb770b01f-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.030 186993 DEBUG os_vif [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=b770b01f-4978-4c34-b9c6-d512018a5dc3,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb770b01f-49') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.031 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.031 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.032 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.037 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.037 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb770b01f-49, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.038 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb770b01f-49, col_values=(('external_ids', {'iface-id': 'b770b01f-4978-4c34-b9c6-d512018a5dc3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:00:2a', 'vm-uuid': '4756e517-16f2-43b0-809d-2464cbd9e219'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.039 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:58 compute-0 NetworkManager[55541]: <info>  [1765362538.0413] manager: (tapb770b01f-49): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.041 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.047 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.048 186993 INFO os_vif [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=b770b01f-4978-4c34-b9c6-d512018a5dc3,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb770b01f-49')
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.131 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.132 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.132 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:31:00:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.134 186993 INFO nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Using config drive
Dec 10 10:28:58 compute-0 podman[218363]: 2025-12-10 10:28:58.197442978 +0000 UTC m=+0.107910659 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.942 186993 INFO nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Creating config drive at /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk.config
Dec 10 10:28:58 compute-0 nova_compute[186989]: 2025-12-10 10:28:58.953 186993 DEBUG oslo_concurrency.processutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8g8nla8x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.097 186993 DEBUG oslo_concurrency.processutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8g8nla8x" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:28:59 compute-0 NetworkManager[55541]: <info>  [1765362539.1765] manager: (tapb770b01f-49): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Dec 10 10:28:59 compute-0 kernel: tapb770b01f-49: entered promiscuous mode
Dec 10 10:28:59 compute-0 ovn_controller[95452]: 2025-12-10T10:28:59Z|00146|binding|INFO|Claiming lport b770b01f-4978-4c34-b9c6-d512018a5dc3 for this chassis.
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.179 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:59 compute-0 ovn_controller[95452]: 2025-12-10T10:28:59Z|00147|binding|INFO|b770b01f-4978-4c34-b9c6-d512018a5dc3: Claiming fa:16:3e:31:00:2a 10.100.0.12
Dec 10 10:28:59 compute-0 ovn_controller[95452]: 2025-12-10T10:28:59Z|00148|binding|INFO|Setting lport b770b01f-4978-4c34-b9c6-d512018a5dc3 ovn-installed in OVS
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.210 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.215 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:59 compute-0 systemd-udevd[218401]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:28:59 compute-0 systemd-machined[153379]: New machine qemu-12-instance-0000000c.
Dec 10 10:28:59 compute-0 NetworkManager[55541]: <info>  [1765362539.2483] device (tapb770b01f-49): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:28:59 compute-0 NetworkManager[55541]: <info>  [1765362539.2509] device (tapb770b01f-49): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:28:59 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Dec 10 10:28:59 compute-0 ovn_controller[95452]: 2025-12-10T10:28:59Z|00149|binding|INFO|Setting lport b770b01f-4978-4c34-b9c6-d512018a5dc3 up in Southbound
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.304 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:00:2a 10.100.0.12'], port_security=['fa:16:3e:31:00:2a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4756e517-16f2-43b0-809d-2464cbd9e219', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': '809d373c-06a5-4972-8e17-4f646033bb2e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=868d1c18-09a1-433f-94c6-fe8a2c537be6, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=b770b01f-4978-4c34-b9c6-d512018a5dc3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.305 104302 INFO neutron.agent.ovn.metadata.agent [-] Port b770b01f-4978-4c34-b9c6-d512018a5dc3 in datapath 88eae834-d1d3-4f81-a0f5-8439ceb543ad bound to our chassis
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.306 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88eae834-d1d3-4f81-a0f5-8439ceb543ad
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.321 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[ccb1fe39-f493-4403-add6-dc69bf20dae9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.357 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[7b32d7ca-ade0-4d45-a43c-746b17390b47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.362 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[3068a691-5cd8-4a51-9e29-1d66f39d3576]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.397 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[06cf05d6-aa63-4f93-a0e6-e18b71bc110b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.418 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[136cac1f-f18f-40a3-8df1-09d7cc0005ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88eae834-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:41:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353172, 'reachable_time': 43673, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218416, 'error': None, 'target': 'ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.439 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[85175d45-484e-4e25-9f7d-7b4a7291558c]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88eae834-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353185, 'tstamp': 353185}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218417, 'error': None, 'target': 'ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88eae834-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353188, 'tstamp': 353188}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218417, 'error': None, 'target': 'ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.441 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88eae834-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.443 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.444 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.447 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88eae834-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.448 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.448 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88eae834-d0, col_values=(('external_ids', {'iface-id': '638b8493-172c-4908-95a3-17b633db5334'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:28:59 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:28:59.449 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.736 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362539.7322328, 4756e517-16f2-43b0-809d-2464cbd9e219 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.736 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] VM Started (Lifecycle Event)
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.767 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.771 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362539.734831, 4756e517-16f2-43b0-809d-2464cbd9e219 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.772 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] VM Paused (Lifecycle Event)
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.796 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.799 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.808 186993 DEBUG nova.compute.manager [req-fef6eeb9-0d0b-4165-9444-fc4d78adfb10 req-f041ef2b-c302-4559-a22a-8bbd790feb6a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Received event network-vif-plugged-b770b01f-4978-4c34-b9c6-d512018a5dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.808 186993 DEBUG oslo_concurrency.lockutils [req-fef6eeb9-0d0b-4165-9444-fc4d78adfb10 req-f041ef2b-c302-4559-a22a-8bbd790feb6a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.809 186993 DEBUG oslo_concurrency.lockutils [req-fef6eeb9-0d0b-4165-9444-fc4d78adfb10 req-f041ef2b-c302-4559-a22a-8bbd790feb6a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.809 186993 DEBUG oslo_concurrency.lockutils [req-fef6eeb9-0d0b-4165-9444-fc4d78adfb10 req-f041ef2b-c302-4559-a22a-8bbd790feb6a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.809 186993 DEBUG nova.compute.manager [req-fef6eeb9-0d0b-4165-9444-fc4d78adfb10 req-f041ef2b-c302-4559-a22a-8bbd790feb6a 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Processing event network-vif-plugged-b770b01f-4978-4c34-b9c6-d512018a5dc3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.810 186993 DEBUG nova.compute.manager [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.814 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.820 186993 INFO nova.virt.libvirt.driver [-] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Instance spawned successfully.
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.820 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.840 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.840 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362539.8134315, 4756e517-16f2-43b0-809d-2464cbd9e219 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.841 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] VM Resumed (Lifecycle Event)
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.854 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.854 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.855 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.856 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.857 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.857 186993 DEBUG nova.virt.libvirt.driver [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.868 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.872 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.903 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.923 186993 INFO nova.compute.manager [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Took 7.25 seconds to spawn the instance on the hypervisor.
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.923 186993 DEBUG nova.compute.manager [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:28:59 compute-0 nova_compute[186989]: 2025-12-10 10:28:59.988 186993 INFO nova.compute.manager [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Took 7.96 seconds to build instance.
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.010 186993 DEBUG oslo_concurrency.lockutils [None req-19f4559a-6f70-47c1-be62-8f4894eb799a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.626 186993 DEBUG nova.network.neutron [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updating instance_info_cache with network_info: [{"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.714 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Releasing lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.715 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.715 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.755 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.755 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.756 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.756 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.808 186993 DEBUG nova.network.neutron [req-0613f8a3-35b0-49da-a938-0c5fd671ad85 req-524e1c5c-5b49-4c8f-a9cc-d556d6880203 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Updated VIF entry in instance network info cache for port b770b01f-4978-4c34-b9c6-d512018a5dc3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.809 186993 DEBUG nova.network.neutron [req-0613f8a3-35b0-49da-a938-0c5fd671ad85 req-524e1c5c-5b49-4c8f-a9cc-d556d6880203 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Updating instance_info_cache with network_info: [{"id": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "address": "fa:16:3e:31:00:2a", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb770b01f-49", "ovs_interfaceid": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.832 186993 DEBUG oslo_concurrency.lockutils [req-0613f8a3-35b0-49da-a938-0c5fd671ad85 req-524e1c5c-5b49-4c8f-a9cc-d556d6880203 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-4756e517-16f2-43b0-809d-2464cbd9e219" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.895 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.992 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:29:00 compute-0 nova_compute[186989]: 2025-12-10 10:29:00.993 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.060 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.066 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.127 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.129 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.204 186993 DEBUG oslo_concurrency.processutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.398 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.400 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5481MB free_disk=73.30024337768555GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.400 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.400 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.491 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Instance 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.492 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Instance 4756e517-16f2-43b0-809d-2464cbd9e219 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.492 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.492 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.559 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.572 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.595 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:29:01 compute-0 nova_compute[186989]: 2025-12-10 10:29:01.595 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:02 compute-0 nova_compute[186989]: 2025-12-10 10:29:02.082 186993 DEBUG nova.compute.manager [req-20d09feb-18bb-4009-a88e-5bb0e963f90e req-90111830-783a-48be-8dfb-0cf2824af64e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Received event network-vif-plugged-b770b01f-4978-4c34-b9c6-d512018a5dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:02 compute-0 nova_compute[186989]: 2025-12-10 10:29:02.082 186993 DEBUG oslo_concurrency.lockutils [req-20d09feb-18bb-4009-a88e-5bb0e963f90e req-90111830-783a-48be-8dfb-0cf2824af64e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:02 compute-0 nova_compute[186989]: 2025-12-10 10:29:02.083 186993 DEBUG oslo_concurrency.lockutils [req-20d09feb-18bb-4009-a88e-5bb0e963f90e req-90111830-783a-48be-8dfb-0cf2824af64e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:02 compute-0 nova_compute[186989]: 2025-12-10 10:29:02.083 186993 DEBUG oslo_concurrency.lockutils [req-20d09feb-18bb-4009-a88e-5bb0e963f90e req-90111830-783a-48be-8dfb-0cf2824af64e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:02 compute-0 nova_compute[186989]: 2025-12-10 10:29:02.083 186993 DEBUG nova.compute.manager [req-20d09feb-18bb-4009-a88e-5bb0e963f90e req-90111830-783a-48be-8dfb-0cf2824af64e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] No waiting events found dispatching network-vif-plugged-b770b01f-4978-4c34-b9c6-d512018a5dc3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:29:02 compute-0 nova_compute[186989]: 2025-12-10 10:29:02.083 186993 WARNING nova.compute.manager [req-20d09feb-18bb-4009-a88e-5bb0e963f90e req-90111830-783a-48be-8dfb-0cf2824af64e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Received unexpected event network-vif-plugged-b770b01f-4978-4c34-b9c6-d512018a5dc3 for instance with vm_state active and task_state None.
Dec 10 10:29:02 compute-0 nova_compute[186989]: 2025-12-10 10:29:02.590 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:29:02 compute-0 nova_compute[186989]: 2025-12-10 10:29:02.924 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:03 compute-0 nova_compute[186989]: 2025-12-10 10:29:03.039 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:04 compute-0 nova_compute[186989]: 2025-12-10 10:29:04.262 186993 DEBUG nova.compute.manager [req-1815fad9-3781-406f-97fd-97be45389aeb req-3fecfb47-38a6-4918-9e65-db0f3d2769d4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Received event network-changed-b770b01f-4978-4c34-b9c6-d512018a5dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:04 compute-0 nova_compute[186989]: 2025-12-10 10:29:04.263 186993 DEBUG nova.compute.manager [req-1815fad9-3781-406f-97fd-97be45389aeb req-3fecfb47-38a6-4918-9e65-db0f3d2769d4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Refreshing instance network info cache due to event network-changed-b770b01f-4978-4c34-b9c6-d512018a5dc3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:29:04 compute-0 nova_compute[186989]: 2025-12-10 10:29:04.263 186993 DEBUG oslo_concurrency.lockutils [req-1815fad9-3781-406f-97fd-97be45389aeb req-3fecfb47-38a6-4918-9e65-db0f3d2769d4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-4756e517-16f2-43b0-809d-2464cbd9e219" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:29:04 compute-0 nova_compute[186989]: 2025-12-10 10:29:04.263 186993 DEBUG oslo_concurrency.lockutils [req-1815fad9-3781-406f-97fd-97be45389aeb req-3fecfb47-38a6-4918-9e65-db0f3d2769d4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-4756e517-16f2-43b0-809d-2464cbd9e219" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:29:04 compute-0 nova_compute[186989]: 2025-12-10 10:29:04.263 186993 DEBUG nova.network.neutron [req-1815fad9-3781-406f-97fd-97be45389aeb req-3fecfb47-38a6-4918-9e65-db0f3d2769d4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Refreshing network info cache for port b770b01f-4978-4c34-b9c6-d512018a5dc3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:29:05 compute-0 podman[218438]: 2025-12-10 10:29:05.072560244 +0000 UTC m=+0.094896930 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Dec 10 10:29:05 compute-0 podman[218439]: 2025-12-10 10:29:05.10394769 +0000 UTC m=+0.116862236 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd)
Dec 10 10:29:05 compute-0 podman[218440]: 2025-12-10 10:29:05.130567205 +0000 UTC m=+0.146749581 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Dec 10 10:29:07 compute-0 nova_compute[186989]: 2025-12-10 10:29:07.927 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:08 compute-0 nova_compute[186989]: 2025-12-10 10:29:08.041 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:08 compute-0 nova_compute[186989]: 2025-12-10 10:29:08.056 186993 DEBUG nova.network.neutron [req-1815fad9-3781-406f-97fd-97be45389aeb req-3fecfb47-38a6-4918-9e65-db0f3d2769d4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Updated VIF entry in instance network info cache for port b770b01f-4978-4c34-b9c6-d512018a5dc3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:29:08 compute-0 nova_compute[186989]: 2025-12-10 10:29:08.057 186993 DEBUG nova.network.neutron [req-1815fad9-3781-406f-97fd-97be45389aeb req-3fecfb47-38a6-4918-9e65-db0f3d2769d4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Updating instance_info_cache with network_info: [{"id": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "address": "fa:16:3e:31:00:2a", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb770b01f-49", "ovs_interfaceid": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:29:08 compute-0 nova_compute[186989]: 2025-12-10 10:29:08.792 186993 DEBUG oslo_concurrency.lockutils [req-1815fad9-3781-406f-97fd-97be45389aeb req-3fecfb47-38a6-4918-9e65-db0f3d2769d4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-4756e517-16f2-43b0-809d-2464cbd9e219" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:29:11 compute-0 ovn_controller[95452]: 2025-12-10T10:29:11Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:31:00:2a 10.100.0.12
Dec 10 10:29:11 compute-0 ovn_controller[95452]: 2025-12-10T10:29:11Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:31:00:2a 10.100.0.12
Dec 10 10:29:12 compute-0 nova_compute[186989]: 2025-12-10 10:29:12.928 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:13 compute-0 nova_compute[186989]: 2025-12-10 10:29:13.044 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:14 compute-0 podman[218511]: 2025-12-10 10:29:14.025626187 +0000 UTC m=+0.068118031 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=)
Dec 10 10:29:16 compute-0 podman[218532]: 2025-12-10 10:29:16.014583727 +0000 UTC m=+0.062656640 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:29:17 compute-0 nova_compute[186989]: 2025-12-10 10:29:17.931 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:18 compute-0 nova_compute[186989]: 2025-12-10 10:29:18.045 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:18 compute-0 nova_compute[186989]: 2025-12-10 10:29:18.814 186993 INFO nova.compute.manager [None req-31d79682-b675-4801-b940-c275a6c9022d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Get console output
Dec 10 10:29:18 compute-0 nova_compute[186989]: 2025-12-10 10:29:18.821 213152 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 10 10:29:19 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:19.806 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:d5:69', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:b1:dd:ed:fa:0b'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:29:19 compute-0 nova_compute[186989]: 2025-12-10 10:29:19.807 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:19 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:19.807 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 10 10:29:19 compute-0 nova_compute[186989]: 2025-12-10 10:29:19.911 186993 DEBUG nova.compute.manager [req-181aa342-819d-459f-b981-ec09103abbe5 req-afb1ef41-7413-4454-87ba-1edce927321f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-changed-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:19 compute-0 nova_compute[186989]: 2025-12-10 10:29:19.911 186993 DEBUG nova.compute.manager [req-181aa342-819d-459f-b981-ec09103abbe5 req-afb1ef41-7413-4454-87ba-1edce927321f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Refreshing instance network info cache due to event network-changed-4a577958-715d-4941-ad19-f7f8b1cf8586. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:29:19 compute-0 nova_compute[186989]: 2025-12-10 10:29:19.912 186993 DEBUG oslo_concurrency.lockutils [req-181aa342-819d-459f-b981-ec09103abbe5 req-afb1ef41-7413-4454-87ba-1edce927321f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:29:19 compute-0 nova_compute[186989]: 2025-12-10 10:29:19.912 186993 DEBUG oslo_concurrency.lockutils [req-181aa342-819d-459f-b981-ec09103abbe5 req-afb1ef41-7413-4454-87ba-1edce927321f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:29:19 compute-0 nova_compute[186989]: 2025-12-10 10:29:19.913 186993 DEBUG nova.network.neutron [req-181aa342-819d-459f-b981-ec09103abbe5 req-afb1ef41-7413-4454-87ba-1edce927321f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Refreshing network info cache for port 4a577958-715d-4941-ad19-f7f8b1cf8586 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:29:19 compute-0 nova_compute[186989]: 2025-12-10 10:29:19.990 186993 DEBUG nova.compute.manager [req-7ffbfb5f-8ee6-4fcb-b4e3-138b2b349d09 req-4801d223-40ab-45b2-8f17-6744037b7d33 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-vif-unplugged-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:19 compute-0 nova_compute[186989]: 2025-12-10 10:29:19.991 186993 DEBUG oslo_concurrency.lockutils [req-7ffbfb5f-8ee6-4fcb-b4e3-138b2b349d09 req-4801d223-40ab-45b2-8f17-6744037b7d33 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:19 compute-0 nova_compute[186989]: 2025-12-10 10:29:19.991 186993 DEBUG oslo_concurrency.lockutils [req-7ffbfb5f-8ee6-4fcb-b4e3-138b2b349d09 req-4801d223-40ab-45b2-8f17-6744037b7d33 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:19 compute-0 nova_compute[186989]: 2025-12-10 10:29:19.991 186993 DEBUG oslo_concurrency.lockutils [req-7ffbfb5f-8ee6-4fcb-b4e3-138b2b349d09 req-4801d223-40ab-45b2-8f17-6744037b7d33 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:19 compute-0 nova_compute[186989]: 2025-12-10 10:29:19.991 186993 DEBUG nova.compute.manager [req-7ffbfb5f-8ee6-4fcb-b4e3-138b2b349d09 req-4801d223-40ab-45b2-8f17-6744037b7d33 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] No waiting events found dispatching network-vif-unplugged-4a577958-715d-4941-ad19-f7f8b1cf8586 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:29:19 compute-0 nova_compute[186989]: 2025-12-10 10:29:19.992 186993 WARNING nova.compute.manager [req-7ffbfb5f-8ee6-4fcb-b4e3-138b2b349d09 req-4801d223-40ab-45b2-8f17-6744037b7d33 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received unexpected event network-vif-unplugged-4a577958-715d-4941-ad19-f7f8b1cf8586 for instance with vm_state active and task_state None.
Dec 10 10:29:20 compute-0 nova_compute[186989]: 2025-12-10 10:29:20.943 186993 DEBUG nova.network.neutron [req-181aa342-819d-459f-b981-ec09103abbe5 req-afb1ef41-7413-4454-87ba-1edce927321f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updated VIF entry in instance network info cache for port 4a577958-715d-4941-ad19-f7f8b1cf8586. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:29:20 compute-0 nova_compute[186989]: 2025-12-10 10:29:20.943 186993 DEBUG nova.network.neutron [req-181aa342-819d-459f-b981-ec09103abbe5 req-afb1ef41-7413-4454-87ba-1edce927321f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updating instance_info_cache with network_info: [{"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:29:21 compute-0 nova_compute[186989]: 2025-12-10 10:29:21.760 186993 INFO nova.compute.manager [None req-7bd1898a-3f03-43ae-9c0f-d3d23762e17a 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Get console output
Dec 10 10:29:21 compute-0 nova_compute[186989]: 2025-12-10 10:29:21.766 213152 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 10 10:29:22 compute-0 nova_compute[186989]: 2025-12-10 10:29:22.027 186993 DEBUG oslo_concurrency.lockutils [req-181aa342-819d-459f-b981-ec09103abbe5 req-afb1ef41-7413-4454-87ba-1edce927321f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:29:22 compute-0 nova_compute[186989]: 2025-12-10 10:29:22.378 186993 DEBUG nova.compute.manager [req-5a41a20f-a094-4cfb-86e9-ab146b3db7ad req-29d6424b-f408-4867-866f-c5ebeb7ed47b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:22 compute-0 nova_compute[186989]: 2025-12-10 10:29:22.378 186993 DEBUG oslo_concurrency.lockutils [req-5a41a20f-a094-4cfb-86e9-ab146b3db7ad req-29d6424b-f408-4867-866f-c5ebeb7ed47b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:22 compute-0 nova_compute[186989]: 2025-12-10 10:29:22.380 186993 DEBUG oslo_concurrency.lockutils [req-5a41a20f-a094-4cfb-86e9-ab146b3db7ad req-29d6424b-f408-4867-866f-c5ebeb7ed47b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:22 compute-0 nova_compute[186989]: 2025-12-10 10:29:22.381 186993 DEBUG oslo_concurrency.lockutils [req-5a41a20f-a094-4cfb-86e9-ab146b3db7ad req-29d6424b-f408-4867-866f-c5ebeb7ed47b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:22 compute-0 nova_compute[186989]: 2025-12-10 10:29:22.381 186993 DEBUG nova.compute.manager [req-5a41a20f-a094-4cfb-86e9-ab146b3db7ad req-29d6424b-f408-4867-866f-c5ebeb7ed47b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] No waiting events found dispatching network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:29:22 compute-0 nova_compute[186989]: 2025-12-10 10:29:22.381 186993 WARNING nova.compute.manager [req-5a41a20f-a094-4cfb-86e9-ab146b3db7ad req-29d6424b-f408-4867-866f-c5ebeb7ed47b 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received unexpected event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 for instance with vm_state active and task_state None.
Dec 10 10:29:22 compute-0 nova_compute[186989]: 2025-12-10 10:29:22.933 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:23 compute-0 nova_compute[186989]: 2025-12-10 10:29:23.047 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.727 186993 DEBUG nova.compute.manager [req-ff3035ef-429d-4e5d-b048-6e45d735d5f7 req-8d04a7f1-95ec-49b2-b7a1-ce15ac47cbad 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-changed-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.728 186993 DEBUG nova.compute.manager [req-ff3035ef-429d-4e5d-b048-6e45d735d5f7 req-8d04a7f1-95ec-49b2-b7a1-ce15ac47cbad 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Refreshing instance network info cache due to event network-changed-4a577958-715d-4941-ad19-f7f8b1cf8586. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.729 186993 DEBUG oslo_concurrency.lockutils [req-ff3035ef-429d-4e5d-b048-6e45d735d5f7 req-8d04a7f1-95ec-49b2-b7a1-ce15ac47cbad 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.729 186993 DEBUG oslo_concurrency.lockutils [req-ff3035ef-429d-4e5d-b048-6e45d735d5f7 req-8d04a7f1-95ec-49b2-b7a1-ce15ac47cbad 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.730 186993 DEBUG nova.network.neutron [req-ff3035ef-429d-4e5d-b048-6e45d735d5f7 req-8d04a7f1-95ec-49b2-b7a1-ce15ac47cbad 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Refreshing network info cache for port 4a577958-715d-4941-ad19-f7f8b1cf8586 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.787 186993 DEBUG nova.compute.manager [req-38833181-b0aa-4483-867e-e0f12fe0f9a5 req-e53e0f93-d2ad-4451-a5b6-78b8481dc3f4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.787 186993 DEBUG oslo_concurrency.lockutils [req-38833181-b0aa-4483-867e-e0f12fe0f9a5 req-e53e0f93-d2ad-4451-a5b6-78b8481dc3f4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.788 186993 DEBUG oslo_concurrency.lockutils [req-38833181-b0aa-4483-867e-e0f12fe0f9a5 req-e53e0f93-d2ad-4451-a5b6-78b8481dc3f4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.788 186993 DEBUG oslo_concurrency.lockutils [req-38833181-b0aa-4483-867e-e0f12fe0f9a5 req-e53e0f93-d2ad-4451-a5b6-78b8481dc3f4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.789 186993 DEBUG nova.compute.manager [req-38833181-b0aa-4483-867e-e0f12fe0f9a5 req-e53e0f93-d2ad-4451-a5b6-78b8481dc3f4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] No waiting events found dispatching network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.789 186993 WARNING nova.compute.manager [req-38833181-b0aa-4483-867e-e0f12fe0f9a5 req-e53e0f93-d2ad-4451-a5b6-78b8481dc3f4 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received unexpected event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 for instance with vm_state active and task_state None.
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.895 186993 INFO nova.compute.manager [None req-57099b0e-9b76-46f9-90a5-7f3ca58ef368 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Get console output
Dec 10 10:29:24 compute-0 nova_compute[186989]: 2025-12-10 10:29:24.901 213152 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.749 186993 DEBUG oslo_concurrency.lockutils [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "4756e517-16f2-43b0-809d-2464cbd9e219" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.750 186993 DEBUG oslo_concurrency.lockutils [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.750 186993 DEBUG oslo_concurrency.lockutils [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.751 186993 DEBUG oslo_concurrency.lockutils [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.752 186993 DEBUG oslo_concurrency.lockutils [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.754 186993 INFO nova.compute.manager [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Terminating instance
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.756 186993 DEBUG nova.compute.manager [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:29:26 compute-0 kernel: tapb770b01f-49 (unregistering): left promiscuous mode
Dec 10 10:29:26 compute-0 NetworkManager[55541]: <info>  [1765362566.7800] device (tapb770b01f-49): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:29:26 compute-0 ovn_controller[95452]: 2025-12-10T10:29:26Z|00150|binding|INFO|Releasing lport b770b01f-4978-4c34-b9c6-d512018a5dc3 from this chassis (sb_readonly=0)
Dec 10 10:29:26 compute-0 ovn_controller[95452]: 2025-12-10T10:29:26Z|00151|binding|INFO|Setting lport b770b01f-4978-4c34-b9c6-d512018a5dc3 down in Southbound
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.796 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:26 compute-0 ovn_controller[95452]: 2025-12-10T10:29:26Z|00152|binding|INFO|Removing iface tapb770b01f-49 ovn-installed in OVS
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.803 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:00:2a 10.100.0.12'], port_security=['fa:16:3e:31:00:2a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '4756e517-16f2-43b0-809d-2464cbd9e219', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '4', 'neutron:security_group_ids': '809d373c-06a5-4972-8e17-4f646033bb2e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=868d1c18-09a1-433f-94c6-fe8a2c537be6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=b770b01f-4978-4c34-b9c6-d512018a5dc3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.805 104302 INFO neutron.agent.ovn.metadata.agent [-] Port b770b01f-4978-4c34-b9c6-d512018a5dc3 in datapath 88eae834-d1d3-4f81-a0f5-8439ceb543ad unbound from our chassis
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.806 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88eae834-d1d3-4f81-a0f5-8439ceb543ad
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.813 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.822 186993 DEBUG nova.compute.manager [req-16625b9f-ad9e-462f-9a27-bf34688fe23b req-90e9b803-e9df-4479-9da7-a7056267dc79 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Received event network-changed-b770b01f-4978-4c34-b9c6-d512018a5dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.823 186993 DEBUG nova.compute.manager [req-16625b9f-ad9e-462f-9a27-bf34688fe23b req-90e9b803-e9df-4479-9da7-a7056267dc79 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Refreshing instance network info cache due to event network-changed-b770b01f-4978-4c34-b9c6-d512018a5dc3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.824 186993 DEBUG oslo_concurrency.lockutils [req-16625b9f-ad9e-462f-9a27-bf34688fe23b req-90e9b803-e9df-4479-9da7-a7056267dc79 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-4756e517-16f2-43b0-809d-2464cbd9e219" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.824 186993 DEBUG oslo_concurrency.lockutils [req-16625b9f-ad9e-462f-9a27-bf34688fe23b req-90e9b803-e9df-4479-9da7-a7056267dc79 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-4756e517-16f2-43b0-809d-2464cbd9e219" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.824 186993 DEBUG nova.network.neutron [req-16625b9f-ad9e-462f-9a27-bf34688fe23b req-90e9b803-e9df-4479-9da7-a7056267dc79 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Refreshing network info cache for port b770b01f-4978-4c34-b9c6-d512018a5dc3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.827 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba983a3-c989-4d1e-93f8-e1b1af9ff5ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:26 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec 10 10:29:26 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 12.695s CPU time.
Dec 10 10:29:26 compute-0 systemd-machined[153379]: Machine qemu-12-instance-0000000c terminated.
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.866 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[2e052bd5-8682-4208-8e7d-fa30352dcf4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.869 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac40ef2-59b4-4253-aea1-dc45b01f130d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.892 186993 DEBUG nova.network.neutron [req-ff3035ef-429d-4e5d-b048-6e45d735d5f7 req-8d04a7f1-95ec-49b2-b7a1-ce15ac47cbad 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updated VIF entry in instance network info cache for port 4a577958-715d-4941-ad19-f7f8b1cf8586. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.893 186993 DEBUG nova.network.neutron [req-ff3035ef-429d-4e5d-b048-6e45d735d5f7 req-8d04a7f1-95ec-49b2-b7a1-ce15ac47cbad 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updating instance_info_cache with network_info: [{"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.900 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[ade47caa-eea6-4579-8842-a878f435a291]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.905 186993 DEBUG nova.compute.manager [req-a10cd244-f417-4482-8292-5bfbdb0de012 req-78fb9eb1-05d7-415c-a37b-65b05b60b088 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.905 186993 DEBUG oslo_concurrency.lockutils [req-a10cd244-f417-4482-8292-5bfbdb0de012 req-78fb9eb1-05d7-415c-a37b-65b05b60b088 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.905 186993 DEBUG oslo_concurrency.lockutils [req-a10cd244-f417-4482-8292-5bfbdb0de012 req-78fb9eb1-05d7-415c-a37b-65b05b60b088 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.905 186993 DEBUG oslo_concurrency.lockutils [req-a10cd244-f417-4482-8292-5bfbdb0de012 req-78fb9eb1-05d7-415c-a37b-65b05b60b088 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.906 186993 DEBUG nova.compute.manager [req-a10cd244-f417-4482-8292-5bfbdb0de012 req-78fb9eb1-05d7-415c-a37b-65b05b60b088 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] No waiting events found dispatching network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.906 186993 WARNING nova.compute.manager [req-a10cd244-f417-4482-8292-5bfbdb0de012 req-78fb9eb1-05d7-415c-a37b-65b05b60b088 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received unexpected event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 for instance with vm_state active and task_state None.
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.909 186993 DEBUG oslo_concurrency.lockutils [req-ff3035ef-429d-4e5d-b048-6e45d735d5f7 req-8d04a7f1-95ec-49b2-b7a1-ce15ac47cbad 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.919 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c19e95-79b0-4a30-9402-e6f8ca0c8b04]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88eae834-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:eb:41:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353172, 'reachable_time': 43673, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218568, 'error': None, 'target': 'ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.937 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5c6b290d-b1a6-4827-b7d6-ee66136293e1]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap88eae834-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353185, 'tstamp': 353185}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218569, 'error': None, 'target': 'ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap88eae834-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353188, 'tstamp': 353188}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218569, 'error': None, 'target': 'ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.939 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88eae834-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.940 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:26 compute-0 nova_compute[186989]: 2025-12-10 10:29:26.945 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.946 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88eae834-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.946 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.946 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88eae834-d0, col_values=(('external_ids', {'iface-id': '638b8493-172c-4908-95a3-17b633db5334'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:29:26 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:26.946 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.017 186993 INFO nova.virt.libvirt.driver [-] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Instance destroyed successfully.
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.018 186993 DEBUG nova.objects.instance [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid 4756e517-16f2-43b0-809d-2464cbd9e219 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.060 186993 DEBUG nova.virt.libvirt.vif [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:28:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-165096650',display_name='tempest-TestNetworkBasicOps-server-165096650',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-165096650',id=12,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAAj2H/3GzC0Ce7KCOnImnvh4ajteZWzrvKUT9mLfsL+YWJqIYjPMlma/vIkHFmwRVulR4YUwjjrmyih6KyZ2A7hIjhdLZE7AD/EHJJn8DSVj8AGGdL1krcc6f7OAU/hvA==',key_name='tempest-TestNetworkBasicOps-1984468410',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:28:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-0n1sppf9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:28:59Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=4756e517-16f2-43b0-809d-2464cbd9e219,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "address": "fa:16:3e:31:00:2a", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb770b01f-49", "ovs_interfaceid": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.061 186993 DEBUG nova.network.os_vif_util [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "address": "fa:16:3e:31:00:2a", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb770b01f-49", "ovs_interfaceid": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.061 186993 DEBUG nova.network.os_vif_util [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:31:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=b770b01f-4978-4c34-b9c6-d512018a5dc3,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb770b01f-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.062 186993 DEBUG os_vif [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=b770b01f-4978-4c34-b9c6-d512018a5dc3,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb770b01f-49') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.065 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.065 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb770b01f-49, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.068 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.069 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.073 186993 INFO os_vif [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:00:2a,bridge_name='br-int',has_traffic_filtering=True,id=b770b01f-4978-4c34-b9c6-d512018a5dc3,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb770b01f-49')
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.074 186993 INFO nova.virt.libvirt.driver [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Deleting instance files /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219_del
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.075 186993 INFO nova.virt.libvirt.driver [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Deletion of /var/lib/nova/instances/4756e517-16f2-43b0-809d-2464cbd9e219_del complete
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.120 186993 INFO nova.compute.manager [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.121 186993 DEBUG oslo.service.loopingcall [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.121 186993 DEBUG nova.compute.manager [-] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.121 186993 DEBUG nova.network.neutron [-] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:29:27 compute-0 nova_compute[186989]: 2025-12-10 10:29:27.935 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:28 compute-0 podman[218588]: 2025-12-10 10:29:28.011931045 +0000 UTC m=+0.052095008 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.772 186993 DEBUG nova.network.neutron [-] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.790 186993 INFO nova.compute.manager [-] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Took 1.67 seconds to deallocate network for instance.
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.846 186993 DEBUG oslo_concurrency.lockutils [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.846 186993 DEBUG oslo_concurrency.lockutils [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.926 186993 DEBUG nova.compute.manager [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Received event network-vif-unplugged-b770b01f-4978-4c34-b9c6-d512018a5dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.927 186993 DEBUG oslo_concurrency.lockutils [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.927 186993 DEBUG oslo_concurrency.lockutils [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.927 186993 DEBUG oslo_concurrency.lockutils [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.928 186993 DEBUG nova.compute.manager [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] No waiting events found dispatching network-vif-unplugged-b770b01f-4978-4c34-b9c6-d512018a5dc3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.928 186993 WARNING nova.compute.manager [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Received unexpected event network-vif-unplugged-b770b01f-4978-4c34-b9c6-d512018a5dc3 for instance with vm_state deleted and task_state None.
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.928 186993 DEBUG nova.compute.manager [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Received event network-vif-plugged-b770b01f-4978-4c34-b9c6-d512018a5dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.929 186993 DEBUG oslo_concurrency.lockutils [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.929 186993 DEBUG oslo_concurrency.lockutils [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.929 186993 DEBUG oslo_concurrency.lockutils [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.929 186993 DEBUG nova.compute.manager [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] No waiting events found dispatching network-vif-plugged-b770b01f-4978-4c34-b9c6-d512018a5dc3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.930 186993 WARNING nova.compute.manager [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Received unexpected event network-vif-plugged-b770b01f-4978-4c34-b9c6-d512018a5dc3 for instance with vm_state deleted and task_state None.
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.930 186993 DEBUG nova.compute.manager [req-5c782988-31b3-4c97-9331-237a095f83f0 req-4681ab5c-7e72-4b79-91a6-a12e9a705ed9 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Received event network-vif-deleted-b770b01f-4978-4c34-b9c6-d512018a5dc3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.964 186993 DEBUG nova.compute.provider_tree [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:29:28 compute-0 nova_compute[186989]: 2025-12-10 10:29:28.981 186993 DEBUG nova.scheduler.client.report [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:29:29 compute-0 nova_compute[186989]: 2025-12-10 10:29:29.004 186993 DEBUG oslo_concurrency.lockutils [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:29 compute-0 nova_compute[186989]: 2025-12-10 10:29:29.052 186993 INFO nova.scheduler.client.report [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance 4756e517-16f2-43b0-809d-2464cbd9e219
Dec 10 10:29:29 compute-0 podman[218613]: 2025-12-10 10:29:29.053745297 +0000 UTC m=+0.090717115 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 10 10:29:29 compute-0 nova_compute[186989]: 2025-12-10 10:29:29.158 186993 DEBUG oslo_concurrency.lockutils [None req-2a75e488-86de-483b-8297-752d710c4f64 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "4756e517-16f2-43b0-809d-2464cbd9e219" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:29 compute-0 nova_compute[186989]: 2025-12-10 10:29:29.583 186993 DEBUG nova.network.neutron [req-16625b9f-ad9e-462f-9a27-bf34688fe23b req-90e9b803-e9df-4479-9da7-a7056267dc79 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Updated VIF entry in instance network info cache for port b770b01f-4978-4c34-b9c6-d512018a5dc3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:29:29 compute-0 nova_compute[186989]: 2025-12-10 10:29:29.583 186993 DEBUG nova.network.neutron [req-16625b9f-ad9e-462f-9a27-bf34688fe23b req-90e9b803-e9df-4479-9da7-a7056267dc79 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Updating instance_info_cache with network_info: [{"id": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "address": "fa:16:3e:31:00:2a", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb770b01f-49", "ovs_interfaceid": "b770b01f-4978-4c34-b9c6-d512018a5dc3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:29:29 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:29.811 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:29:29 compute-0 nova_compute[186989]: 2025-12-10 10:29:29.919 186993 DEBUG oslo_concurrency.lockutils [req-16625b9f-ad9e-462f-9a27-bf34688fe23b req-90e9b803-e9df-4479-9da7-a7056267dc79 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-4756e517-16f2-43b0-809d-2464cbd9e219" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:29:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:31.472 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:31.472 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:31.474 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:32 compute-0 nova_compute[186989]: 2025-12-10 10:29:32.069 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:32 compute-0 nova_compute[186989]: 2025-12-10 10:29:32.937 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.044 186993 DEBUG nova.compute.manager [req-e0aa801d-e27e-45a0-8f2e-b3c79924f1bf req-f2731a8f-4375-4479-9d97-7a7c1495ffba 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-changed-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.045 186993 DEBUG nova.compute.manager [req-e0aa801d-e27e-45a0-8f2e-b3c79924f1bf req-f2731a8f-4375-4479-9d97-7a7c1495ffba 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Refreshing instance network info cache due to event network-changed-4a577958-715d-4941-ad19-f7f8b1cf8586. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.045 186993 DEBUG oslo_concurrency.lockutils [req-e0aa801d-e27e-45a0-8f2e-b3c79924f1bf req-f2731a8f-4375-4479-9d97-7a7c1495ffba 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.046 186993 DEBUG oslo_concurrency.lockutils [req-e0aa801d-e27e-45a0-8f2e-b3c79924f1bf req-f2731a8f-4375-4479-9d97-7a7c1495ffba 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.046 186993 DEBUG nova.network.neutron [req-e0aa801d-e27e-45a0-8f2e-b3c79924f1bf req-f2731a8f-4375-4479-9d97-7a7c1495ffba 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Refreshing network info cache for port 4a577958-715d-4941-ad19-f7f8b1cf8586 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.156 186993 DEBUG oslo_concurrency.lockutils [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.157 186993 DEBUG oslo_concurrency.lockutils [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.157 186993 DEBUG oslo_concurrency.lockutils [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.157 186993 DEBUG oslo_concurrency.lockutils [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.158 186993 DEBUG oslo_concurrency.lockutils [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.159 186993 INFO nova.compute.manager [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Terminating instance
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.160 186993 DEBUG nova.compute.manager [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:29:34 compute-0 kernel: tap4a577958-71 (unregistering): left promiscuous mode
Dec 10 10:29:34 compute-0 NetworkManager[55541]: <info>  [1765362574.1936] device (tap4a577958-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.196 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:34 compute-0 ovn_controller[95452]: 2025-12-10T10:29:34Z|00153|binding|INFO|Releasing lport 4a577958-715d-4941-ad19-f7f8b1cf8586 from this chassis (sb_readonly=0)
Dec 10 10:29:34 compute-0 ovn_controller[95452]: 2025-12-10T10:29:34Z|00154|binding|INFO|Setting lport 4a577958-715d-4941-ad19-f7f8b1cf8586 down in Southbound
Dec 10 10:29:34 compute-0 ovn_controller[95452]: 2025-12-10T10:29:34Z|00155|binding|INFO|Removing iface tap4a577958-71 ovn-installed in OVS
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.199 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.204 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fb:ca:06 10.100.0.9'], port_security=['fa:16:3e:fb:ca:06 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9d70dd0a-d1e7-4821-a30b-0f11f1440ae5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ddafa709-2a55-4d17-9bb4-cff67cccfae8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=868d1c18-09a1-433f-94c6-fe8a2c537be6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=4a577958-715d-4941-ad19-f7f8b1cf8586) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.206 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 4a577958-715d-4941-ad19-f7f8b1cf8586 in datapath 88eae834-d1d3-4f81-a0f5-8439ceb543ad unbound from our chassis
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.207 104302 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88eae834-d1d3-4f81-a0f5-8439ceb543ad, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.208 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[8b743e05-4ed0-420c-94e9-0ef298cd4725]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.209 104302 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad namespace which is not needed anymore
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.213 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:34 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Dec 10 10:29:34 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 14.668s CPU time.
Dec 10 10:29:34 compute-0 systemd-machined[153379]: Machine qemu-11-instance-0000000b terminated.
Dec 10 10:29:34 compute-0 neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad[218246]: [NOTICE]   (218250) : haproxy version is 2.8.14-c23fe91
Dec 10 10:29:34 compute-0 neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad[218246]: [NOTICE]   (218250) : path to executable is /usr/sbin/haproxy
Dec 10 10:29:34 compute-0 neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad[218246]: [WARNING]  (218250) : Exiting Master process...
Dec 10 10:29:34 compute-0 neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad[218246]: [ALERT]    (218250) : Current worker (218252) exited with code 143 (Terminated)
Dec 10 10:29:34 compute-0 neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad[218246]: [WARNING]  (218250) : All workers exited. Exiting... (0)
Dec 10 10:29:34 compute-0 systemd[1]: libpod-47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e.scope: Deactivated successfully.
Dec 10 10:29:34 compute-0 podman[218657]: 2025-12-10 10:29:34.355454531 +0000 UTC m=+0.049363353 container died 47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 10 10:29:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e-userdata-shm.mount: Deactivated successfully.
Dec 10 10:29:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-502738898ef3109e58551bffd974a0461136268b3e97951cd24fa05ee086853f-merged.mount: Deactivated successfully.
Dec 10 10:29:34 compute-0 podman[218657]: 2025-12-10 10:29:34.396060132 +0000 UTC m=+0.089968934 container cleanup 47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 10 10:29:34 compute-0 systemd[1]: libpod-conmon-47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e.scope: Deactivated successfully.
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.423 186993 INFO nova.virt.libvirt.driver [-] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Instance destroyed successfully.
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.423 186993 DEBUG nova.objects.instance [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.444 186993 DEBUG nova.virt.libvirt.vif [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:28:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-73136063',display_name='tempest-TestNetworkBasicOps-server-73136063',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-73136063',id=11,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDBmKvYbT6bj4t57mDlrUSDmSjI3rzy8h8vceGx1Im7+MUXO41DcyBhm3Ct6eex1O+G5RLMqXn5sL8le7JyNPS5pNGQDvSbn/Ev+RDYQsNA6pKTEQis7SzRTupmQ5oaPFA==',key_name='tempest-TestNetworkBasicOps-1390815627',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:28:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-rzca3oo0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:28:37Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=9d70dd0a-d1e7-4821-a30b-0f11f1440ae5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.444 186993 DEBUG nova.network.os_vif_util [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.446 186993 DEBUG nova.network.os_vif_util [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fb:ca:06,bridge_name='br-int',has_traffic_filtering=True,id=4a577958-715d-4941-ad19-f7f8b1cf8586,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a577958-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.446 186993 DEBUG os_vif [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fb:ca:06,bridge_name='br-int',has_traffic_filtering=True,id=4a577958-715d-4941-ad19-f7f8b1cf8586,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a577958-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.448 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.449 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a577958-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.452 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.454 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.457 186993 INFO os_vif [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fb:ca:06,bridge_name='br-int',has_traffic_filtering=True,id=4a577958-715d-4941-ad19-f7f8b1cf8586,network=Network(88eae834-d1d3-4f81-a0f5-8439ceb543ad),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a577958-71')
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.458 186993 INFO nova.virt.libvirt.driver [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Deleting instance files /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5_del
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.459 186993 INFO nova.virt.libvirt.driver [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Deletion of /var/lib/nova/instances/9d70dd0a-d1e7-4821-a30b-0f11f1440ae5_del complete
Dec 10 10:29:34 compute-0 podman[218706]: 2025-12-10 10:29:34.476553862 +0000 UTC m=+0.047557133 container remove 47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.485 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[a47549c9-346e-4040-ba77-3adc4dd87a1a]: (4, ('Wed Dec 10 10:29:34 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad (47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e)\n47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e\nWed Dec 10 10:29:34 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad (47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e)\n47f122cb65f77627044229641340c06ba4d3f77e0ea9fc27ff19c6096a9c773e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.486 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[56eaa889-952e-4fcf-aa28-16a460905916]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.487 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88eae834-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.489 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:34 compute-0 kernel: tap88eae834-d0: left promiscuous mode
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.511 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.516 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9eb8bc-e032-4aa6-b3c8-964269d31b1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.529 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[acf7063c-b181-4b74-b6e0-1db71fea5fe1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.530 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[d2a11a9e-81cf-4302-8e71-1ac083db6f71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.546 186993 INFO nova.compute.manager [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Took 0.39 seconds to destroy the instance on the hypervisor.
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.547 186993 DEBUG oslo.service.loopingcall [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.547 186993 DEBUG nova.compute.manager [-] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.547 186993 DEBUG nova.network.neutron [-] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.551 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4fe46c-5575-4e87-8543-102da8d8ae4a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353165, 'reachable_time': 22546, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218722, 'error': None, 'target': 'ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.554 104414 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88eae834-d1d3-4f81-a0f5-8439ceb543ad deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 10 10:29:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:29:34.554 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[2d53c994-c1d1-4652-bd71-8a24f768df1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:29:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d88eae834\x2dd1d3\x2d4f81\x2da0f5\x2d8439ceb543ad.mount: Deactivated successfully.
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.616 186993 DEBUG nova.compute.manager [req-797d9980-7055-4d68-b1a9-4450efb988f2 req-38aff039-ba73-40f4-b484-8e833ea484d5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-vif-unplugged-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.616 186993 DEBUG oslo_concurrency.lockutils [req-797d9980-7055-4d68-b1a9-4450efb988f2 req-38aff039-ba73-40f4-b484-8e833ea484d5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.617 186993 DEBUG oslo_concurrency.lockutils [req-797d9980-7055-4d68-b1a9-4450efb988f2 req-38aff039-ba73-40f4-b484-8e833ea484d5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.617 186993 DEBUG oslo_concurrency.lockutils [req-797d9980-7055-4d68-b1a9-4450efb988f2 req-38aff039-ba73-40f4-b484-8e833ea484d5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.617 186993 DEBUG nova.compute.manager [req-797d9980-7055-4d68-b1a9-4450efb988f2 req-38aff039-ba73-40f4-b484-8e833ea484d5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] No waiting events found dispatching network-vif-unplugged-4a577958-715d-4941-ad19-f7f8b1cf8586 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:29:34 compute-0 nova_compute[186989]: 2025-12-10 10:29:34.617 186993 DEBUG nova.compute.manager [req-797d9980-7055-4d68-b1a9-4450efb988f2 req-38aff039-ba73-40f4-b484-8e833ea484d5 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-vif-unplugged-4a577958-715d-4941-ad19-f7f8b1cf8586 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 10 10:29:35 compute-0 nova_compute[186989]: 2025-12-10 10:29:35.804 186993 DEBUG nova.network.neutron [-] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:29:35 compute-0 nova_compute[186989]: 2025-12-10 10:29:35.824 186993 INFO nova.compute.manager [-] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Took 1.28 seconds to deallocate network for instance.
Dec 10 10:29:35 compute-0 nova_compute[186989]: 2025-12-10 10:29:35.882 186993 DEBUG oslo_concurrency.lockutils [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:35 compute-0 nova_compute[186989]: 2025-12-10 10:29:35.882 186993 DEBUG oslo_concurrency.lockutils [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:35 compute-0 nova_compute[186989]: 2025-12-10 10:29:35.939 186993 DEBUG nova.compute.provider_tree [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:29:35 compute-0 nova_compute[186989]: 2025-12-10 10:29:35.954 186993 DEBUG nova.scheduler.client.report [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:29:35 compute-0 nova_compute[186989]: 2025-12-10 10:29:35.980 186993 DEBUG oslo_concurrency.lockutils [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:36 compute-0 nova_compute[186989]: 2025-12-10 10:29:36.003 186993 INFO nova.scheduler.client.report [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5
Dec 10 10:29:36 compute-0 podman[218724]: 2025-12-10 10:29:36.076256221 +0000 UTC m=+0.107359884 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.vendor=CentOS)
Dec 10 10:29:36 compute-0 podman[218725]: 2025-12-10 10:29:36.076447446 +0000 UTC m=+0.112009482 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 10 10:29:36 compute-0 nova_compute[186989]: 2025-12-10 10:29:36.083 186993 DEBUG oslo_concurrency.lockutils [None req-86645eef-e2f0-4a6a-a6b8-4057386ad5e2 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.926s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:36 compute-0 podman[218723]: 2025-12-10 10:29:36.083662975 +0000 UTC m=+0.116227609 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 10 10:29:36 compute-0 nova_compute[186989]: 2025-12-10 10:29:36.134 186993 DEBUG nova.compute.manager [req-0906435b-45d9-4a0a-a5bb-877b09000dff req-caeb4af7-2834-441b-9d40-7c95201e5ca2 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-vif-deleted-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:36 compute-0 nova_compute[186989]: 2025-12-10 10:29:36.215 186993 DEBUG nova.network.neutron [req-e0aa801d-e27e-45a0-8f2e-b3c79924f1bf req-f2731a8f-4375-4479-9d97-7a7c1495ffba 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updated VIF entry in instance network info cache for port 4a577958-715d-4941-ad19-f7f8b1cf8586. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:29:36 compute-0 nova_compute[186989]: 2025-12-10 10:29:36.215 186993 DEBUG nova.network.neutron [req-e0aa801d-e27e-45a0-8f2e-b3c79924f1bf req-f2731a8f-4375-4479-9d97-7a7c1495ffba 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Updating instance_info_cache with network_info: [{"id": "4a577958-715d-4941-ad19-f7f8b1cf8586", "address": "fa:16:3e:fb:ca:06", "network": {"id": "88eae834-d1d3-4f81-a0f5-8439ceb543ad", "bridge": "br-int", "label": "tempest-network-smoke--1643820045", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a577958-71", "ovs_interfaceid": "4a577958-715d-4941-ad19-f7f8b1cf8586", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:29:36 compute-0 nova_compute[186989]: 2025-12-10 10:29:36.236 186993 DEBUG oslo_concurrency.lockutils [req-e0aa801d-e27e-45a0-8f2e-b3c79924f1bf req-f2731a8f-4375-4479-9d97-7a7c1495ffba 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-9d70dd0a-d1e7-4821-a30b-0f11f1440ae5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:29:36 compute-0 nova_compute[186989]: 2025-12-10 10:29:36.713 186993 DEBUG nova.compute.manager [req-6ede7ab3-dc44-4e21-9461-b297143b7f2e req-808d0b2b-c05b-4548-8572-69665b88fc77 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:29:36 compute-0 nova_compute[186989]: 2025-12-10 10:29:36.714 186993 DEBUG oslo_concurrency.lockutils [req-6ede7ab3-dc44-4e21-9461-b297143b7f2e req-808d0b2b-c05b-4548-8572-69665b88fc77 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:36 compute-0 nova_compute[186989]: 2025-12-10 10:29:36.715 186993 DEBUG oslo_concurrency.lockutils [req-6ede7ab3-dc44-4e21-9461-b297143b7f2e req-808d0b2b-c05b-4548-8572-69665b88fc77 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:36 compute-0 nova_compute[186989]: 2025-12-10 10:29:36.715 186993 DEBUG oslo_concurrency.lockutils [req-6ede7ab3-dc44-4e21-9461-b297143b7f2e req-808d0b2b-c05b-4548-8572-69665b88fc77 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "9d70dd0a-d1e7-4821-a30b-0f11f1440ae5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:36 compute-0 nova_compute[186989]: 2025-12-10 10:29:36.716 186993 DEBUG nova.compute.manager [req-6ede7ab3-dc44-4e21-9461-b297143b7f2e req-808d0b2b-c05b-4548-8572-69665b88fc77 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] No waiting events found dispatching network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:29:36 compute-0 nova_compute[186989]: 2025-12-10 10:29:36.716 186993 WARNING nova.compute.manager [req-6ede7ab3-dc44-4e21-9461-b297143b7f2e req-808d0b2b-c05b-4548-8572-69665b88fc77 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Received unexpected event network-vif-plugged-4a577958-715d-4941-ad19-f7f8b1cf8586 for instance with vm_state deleted and task_state None.
Dec 10 10:29:37 compute-0 nova_compute[186989]: 2025-12-10 10:29:37.963 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:39 compute-0 nova_compute[186989]: 2025-12-10 10:29:39.452 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:40 compute-0 nova_compute[186989]: 2025-12-10 10:29:40.994 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:41 compute-0 nova_compute[186989]: 2025-12-10 10:29:41.104 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:42 compute-0 nova_compute[186989]: 2025-12-10 10:29:42.015 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362567.0133712, 4756e517-16f2-43b0-809d-2464cbd9e219 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:29:42 compute-0 nova_compute[186989]: 2025-12-10 10:29:42.016 186993 INFO nova.compute.manager [-] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] VM Stopped (Lifecycle Event)
Dec 10 10:29:42 compute-0 nova_compute[186989]: 2025-12-10 10:29:42.036 186993 DEBUG nova.compute.manager [None req-de3020cb-e7eb-446c-99a5-1700d722be6d - - - - - -] [instance: 4756e517-16f2-43b0-809d-2464cbd9e219] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:29:42 compute-0 nova_compute[186989]: 2025-12-10 10:29:42.966 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:44 compute-0 nova_compute[186989]: 2025-12-10 10:29:44.456 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:45 compute-0 podman[218787]: 2025-12-10 10:29:45.062194091 +0000 UTC m=+0.096152105 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_id=edpm, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 10 10:29:47 compute-0 podman[218808]: 2025-12-10 10:29:47.023144367 +0000 UTC m=+0.065752626 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 10 10:29:47 compute-0 nova_compute[186989]: 2025-12-10 10:29:47.968 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:48 compute-0 nova_compute[186989]: 2025-12-10 10:29:48.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:29:49 compute-0 nova_compute[186989]: 2025-12-10 10:29:49.422 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362574.4208665, 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:29:49 compute-0 nova_compute[186989]: 2025-12-10 10:29:49.422 186993 INFO nova.compute.manager [-] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] VM Stopped (Lifecycle Event)
Dec 10 10:29:49 compute-0 nova_compute[186989]: 2025-12-10 10:29:49.455 186993 DEBUG nova.compute.manager [None req-cfe9f1d6-c7b9-4af2-9afb-90b669483022 - - - - - -] [instance: 9d70dd0a-d1e7-4821-a30b-0f11f1440ae5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:29:49 compute-0 nova_compute[186989]: 2025-12-10 10:29:49.459 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:49 compute-0 nova_compute[186989]: 2025-12-10 10:29:49.917 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:29:52 compute-0 nova_compute[186989]: 2025-12-10 10:29:52.970 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:53 compute-0 nova_compute[186989]: 2025-12-10 10:29:53.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:29:53 compute-0 nova_compute[186989]: 2025-12-10 10:29:53.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:29:53 compute-0 nova_compute[186989]: 2025-12-10 10:29:53.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:29:54 compute-0 nova_compute[186989]: 2025-12-10 10:29:54.463 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:56 compute-0 nova_compute[186989]: 2025-12-10 10:29:56.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:29:56 compute-0 nova_compute[186989]: 2025-12-10 10:29:56.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:29:56 compute-0 nova_compute[186989]: 2025-12-10 10:29:56.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:29:56 compute-0 nova_compute[186989]: 2025-12-10 10:29:56.934 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:29:56 compute-0 nova_compute[186989]: 2025-12-10 10:29:56.935 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:29:56 compute-0 nova_compute[186989]: 2025-12-10 10:29:56.935 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:29:57 compute-0 nova_compute[186989]: 2025-12-10 10:29:57.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:29:57 compute-0 nova_compute[186989]: 2025-12-10 10:29:57.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:29:57 compute-0 nova_compute[186989]: 2025-12-10 10:29:57.972 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:59 compute-0 podman[218833]: 2025-12-10 10:29:59.036591209 +0000 UTC m=+0.072064309 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 10 10:29:59 compute-0 nova_compute[186989]: 2025-12-10 10:29:59.467 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:29:59 compute-0 nova_compute[186989]: 2025-12-10 10:29:59.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:29:59 compute-0 nova_compute[186989]: 2025-12-10 10:29:59.951 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:29:59 compute-0 nova_compute[186989]: 2025-12-10 10:29:59.951 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:29:59 compute-0 nova_compute[186989]: 2025-12-10 10:29:59.952 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:29:59 compute-0 nova_compute[186989]: 2025-12-10 10:29:59.952 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:30:00 compute-0 podman[218857]: 2025-12-10 10:30:00.048208738 +0000 UTC m=+0.079314910 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Dec 10 10:30:00 compute-0 nova_compute[186989]: 2025-12-10 10:30:00.149 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:30:00 compute-0 nova_compute[186989]: 2025-12-10 10:30:00.150 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5749MB free_disk=73.32890701293945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:30:00 compute-0 nova_compute[186989]: 2025-12-10 10:30:00.150 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:00 compute-0 nova_compute[186989]: 2025-12-10 10:30:00.150 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:00 compute-0 nova_compute[186989]: 2025-12-10 10:30:00.235 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:30:00 compute-0 nova_compute[186989]: 2025-12-10 10:30:00.236 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:30:00 compute-0 nova_compute[186989]: 2025-12-10 10:30:00.260 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:30:00 compute-0 nova_compute[186989]: 2025-12-10 10:30:00.276 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:30:00 compute-0 nova_compute[186989]: 2025-12-10 10:30:00.298 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:30:00 compute-0 nova_compute[186989]: 2025-12-10 10:30:00.298 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:03 compute-0 nova_compute[186989]: 2025-12-10 10:30:03.159 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:04 compute-0 nova_compute[186989]: 2025-12-10 10:30:04.470 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:06 compute-0 nova_compute[186989]: 2025-12-10 10:30:06.878 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "cebdf629-e283-44bf-9a2a-1514afba22b3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:06 compute-0 nova_compute[186989]: 2025-12-10 10:30:06.878 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:06 compute-0 nova_compute[186989]: 2025-12-10 10:30:06.897 186993 DEBUG nova.compute.manager [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 10 10:30:06 compute-0 nova_compute[186989]: 2025-12-10 10:30:06.968 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:06 compute-0 nova_compute[186989]: 2025-12-10 10:30:06.968 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:06 compute-0 nova_compute[186989]: 2025-12-10 10:30:06.974 186993 DEBUG nova.virt.hardware [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 10 10:30:06 compute-0 nova_compute[186989]: 2025-12-10 10:30:06.974 186993 INFO nova.compute.claims [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Claim successful on node compute-0.ctlplane.example.com
Dec 10 10:30:07 compute-0 podman[218878]: 2025-12-10 10:30:07.048175804 +0000 UTC m=+0.076771654 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 10 10:30:07 compute-0 podman[218877]: 2025-12-10 10:30:07.057410572 +0000 UTC m=+0.095121786 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 10 10:30:07 compute-0 podman[218879]: 2025-12-10 10:30:07.085552408 +0000 UTC m=+0.113798547 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.087 186993 DEBUG nova.compute.provider_tree [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.101 186993 DEBUG nova.scheduler.client.report [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.118 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.119 186993 DEBUG nova.compute.manager [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.164 186993 DEBUG nova.compute.manager [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.165 186993 DEBUG nova.network.neutron [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.181 186993 INFO nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.199 186993 DEBUG nova.compute.manager [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.282 186993 DEBUG nova.compute.manager [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.284 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.285 186993 INFO nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Creating image(s)
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.286 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "/var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.286 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.287 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "/var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.311 186993 DEBUG oslo_concurrency.processutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.395 186993 DEBUG oslo_concurrency.processutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.397 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.398 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.413 186993 DEBUG oslo_concurrency.processutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.469 186993 DEBUG oslo_concurrency.processutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.471 186993 DEBUG oslo_concurrency.processutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.525 186993 DEBUG oslo_concurrency.processutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1,backing_fmt=raw /var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.527 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "fce84dba217ab0844b56fdcc1482691d16d7d8a1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.528 186993 DEBUG oslo_concurrency.processutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.584 186993 DEBUG oslo_concurrency.processutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/fce84dba217ab0844b56fdcc1482691d16d7d8a1 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.585 186993 DEBUG nova.virt.disk.api [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Checking if we can resize image /var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.585 186993 DEBUG oslo_concurrency.processutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.610 186993 DEBUG nova.policy [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '603f9c3a99e145e4a64248329321a249', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82da19f85bb840d2a70395c3d761ef38', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.640 186993 DEBUG oslo_concurrency.processutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.642 186993 DEBUG nova.virt.disk.api [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Cannot resize image /var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.642 186993 DEBUG nova.objects.instance [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'migration_context' on Instance uuid cebdf629-e283-44bf-9a2a-1514afba22b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.663 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.664 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Ensure instance console log exists: /var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.664 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.664 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:07 compute-0 nova_compute[186989]: 2025-12-10 10:30:07.665 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:08 compute-0 nova_compute[186989]: 2025-12-10 10:30:08.160 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:08 compute-0 nova_compute[186989]: 2025-12-10 10:30:08.557 186993 DEBUG nova.network.neutron [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Successfully created port: 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 10 10:30:09 compute-0 nova_compute[186989]: 2025-12-10 10:30:09.473 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:09 compute-0 nova_compute[186989]: 2025-12-10 10:30:09.885 186993 DEBUG nova.network.neutron [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Successfully updated port: 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 10 10:30:09 compute-0 nova_compute[186989]: 2025-12-10 10:30:09.938 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "refresh_cache-cebdf629-e283-44bf-9a2a-1514afba22b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:30:09 compute-0 nova_compute[186989]: 2025-12-10 10:30:09.939 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquired lock "refresh_cache-cebdf629-e283-44bf-9a2a-1514afba22b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:30:09 compute-0 nova_compute[186989]: 2025-12-10 10:30:09.939 186993 DEBUG nova.network.neutron [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 10 10:30:10 compute-0 nova_compute[186989]: 2025-12-10 10:30:10.079 186993 DEBUG nova.compute.manager [req-e5810789-93c5-4c87-b866-e60e472cf64a req-f50aa6f2-981d-4c74-8c8f-2c9e7956d172 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Received event network-changed-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:30:10 compute-0 nova_compute[186989]: 2025-12-10 10:30:10.080 186993 DEBUG nova.compute.manager [req-e5810789-93c5-4c87-b866-e60e472cf64a req-f50aa6f2-981d-4c74-8c8f-2c9e7956d172 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Refreshing instance network info cache due to event network-changed-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:30:10 compute-0 nova_compute[186989]: 2025-12-10 10:30:10.080 186993 DEBUG oslo_concurrency.lockutils [req-e5810789-93c5-4c87-b866-e60e472cf64a req-f50aa6f2-981d-4c74-8c8f-2c9e7956d172 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-cebdf629-e283-44bf-9a2a-1514afba22b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:30:10 compute-0 nova_compute[186989]: 2025-12-10 10:30:10.151 186993 DEBUG nova.network.neutron [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.235 186993 DEBUG nova.network.neutron [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Updating instance_info_cache with network_info: [{"id": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "address": "fa:16:3e:f9:18:e2", "network": {"id": "5bd242ab-51b6-45e4-8b1d-02223f35355a", "bridge": "br-int", "label": "tempest-network-smoke--1541010426", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a1f903-cf", "ovs_interfaceid": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.259 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Releasing lock "refresh_cache-cebdf629-e283-44bf-9a2a-1514afba22b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.260 186993 DEBUG nova.compute.manager [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Instance network_info: |[{"id": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "address": "fa:16:3e:f9:18:e2", "network": {"id": "5bd242ab-51b6-45e4-8b1d-02223f35355a", "bridge": "br-int", "label": "tempest-network-smoke--1541010426", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a1f903-cf", "ovs_interfaceid": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.260 186993 DEBUG oslo_concurrency.lockutils [req-e5810789-93c5-4c87-b866-e60e472cf64a req-f50aa6f2-981d-4c74-8c8f-2c9e7956d172 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-cebdf629-e283-44bf-9a2a-1514afba22b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.261 186993 DEBUG nova.network.neutron [req-e5810789-93c5-4c87-b866-e60e472cf64a req-f50aa6f2-981d-4c74-8c8f-2c9e7956d172 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Refreshing network info cache for port 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.266 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Start _get_guest_xml network_info=[{"id": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "address": "fa:16:3e:f9:18:e2", "network": {"id": "5bd242ab-51b6-45e4-8b1d-02223f35355a", "bridge": "br-int", "label": "tempest-network-smoke--1541010426", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a1f903-cf", "ovs_interfaceid": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'encrypted': False, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_type': 'disk', 'device_name': '/dev/vda', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'image_id': 'db4e7c9d-c1ff-44a9-9cd7-57ab019e9474'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.274 186993 WARNING nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.279 186993 DEBUG nova.virt.libvirt.host [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.280 186993 DEBUG nova.virt.libvirt.host [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.289 186993 DEBUG nova.virt.libvirt.host [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.290 186993 DEBUG nova.virt.libvirt.host [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.291 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.291 186993 DEBUG nova.virt.hardware [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-10T10:19:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6f9bf686-c5d3-4e9c-a944-269864569e67',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-10T10:19:56Z,direct_url=<?>,disk_format='qcow2',id=db4e7c9d-c1ff-44a9-9cd7-57ab019e9474,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c2231ca0f94b4b4fa9b96f4406a080b9',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-10T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.291 186993 DEBUG nova.virt.hardware [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.292 186993 DEBUG nova.virt.hardware [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.292 186993 DEBUG nova.virt.hardware [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.292 186993 DEBUG nova.virt.hardware [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.292 186993 DEBUG nova.virt.hardware [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.293 186993 DEBUG nova.virt.hardware [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.293 186993 DEBUG nova.virt.hardware [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.293 186993 DEBUG nova.virt.hardware [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.293 186993 DEBUG nova.virt.hardware [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.295 186993 DEBUG nova.virt.hardware [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.305 186993 DEBUG nova.virt.libvirt.vif [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:30:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-88465170',display_name='tempest-TestNetworkBasicOps-server-88465170',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-88465170',id=13,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH12OsC6l+BK+3BdHFQ2URxLIaiGsn4LygeiL3VxFFfe6CQFkRDS/QEafYDMHjWQ6VeoBF+WSxoFA76M/WA3Tc1QjssBcprA8xf3ad+D/e1vRO9Y6FZt9lFcDz7kSwDDeA==',key_name='tempest-TestNetworkBasicOps-1955377106',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-v3299kmn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:30:07Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=cebdf629-e283-44bf-9a2a-1514afba22b3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "address": "fa:16:3e:f9:18:e2", "network": {"id": "5bd242ab-51b6-45e4-8b1d-02223f35355a", "bridge": "br-int", "label": "tempest-network-smoke--1541010426", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a1f903-cf", "ovs_interfaceid": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.306 186993 DEBUG nova.network.os_vif_util [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "address": "fa:16:3e:f9:18:e2", "network": {"id": "5bd242ab-51b6-45e4-8b1d-02223f35355a", "bridge": "br-int", "label": "tempest-network-smoke--1541010426", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a1f903-cf", "ovs_interfaceid": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.307 186993 DEBUG nova.network.os_vif_util [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:e2,bridge_name='br-int',has_traffic_filtering=True,id=00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3,network=Network(5bd242ab-51b6-45e4-8b1d-02223f35355a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00a1f903-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.309 186993 DEBUG nova.objects.instance [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'pci_devices' on Instance uuid cebdf629-e283-44bf-9a2a-1514afba22b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.325 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] End _get_guest_xml xml=<domain type="kvm">
Dec 10 10:30:11 compute-0 nova_compute[186989]:   <uuid>cebdf629-e283-44bf-9a2a-1514afba22b3</uuid>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   <name>instance-0000000d</name>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   <memory>131072</memory>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   <vcpu>1</vcpu>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   <metadata>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <nova:name>tempest-TestNetworkBasicOps-server-88465170</nova:name>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <nova:creationTime>2025-12-10 10:30:11</nova:creationTime>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <nova:flavor name="m1.nano">
Dec 10 10:30:11 compute-0 nova_compute[186989]:         <nova:memory>128</nova:memory>
Dec 10 10:30:11 compute-0 nova_compute[186989]:         <nova:disk>1</nova:disk>
Dec 10 10:30:11 compute-0 nova_compute[186989]:         <nova:swap>0</nova:swap>
Dec 10 10:30:11 compute-0 nova_compute[186989]:         <nova:ephemeral>0</nova:ephemeral>
Dec 10 10:30:11 compute-0 nova_compute[186989]:         <nova:vcpus>1</nova:vcpus>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       </nova:flavor>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <nova:owner>
Dec 10 10:30:11 compute-0 nova_compute[186989]:         <nova:user uuid="603f9c3a99e145e4a64248329321a249">tempest-TestNetworkBasicOps-319431412-project-member</nova:user>
Dec 10 10:30:11 compute-0 nova_compute[186989]:         <nova:project uuid="82da19f85bb840d2a70395c3d761ef38">tempest-TestNetworkBasicOps-319431412</nova:project>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       </nova:owner>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <nova:root type="image" uuid="db4e7c9d-c1ff-44a9-9cd7-57ab019e9474"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <nova:ports>
Dec 10 10:30:11 compute-0 nova_compute[186989]:         <nova:port uuid="00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3">
Dec 10 10:30:11 compute-0 nova_compute[186989]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:         </nova:port>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       </nova:ports>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     </nova:instance>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   </metadata>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   <sysinfo type="smbios">
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <system>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <entry name="manufacturer">RDO</entry>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <entry name="product">OpenStack Compute</entry>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <entry name="serial">cebdf629-e283-44bf-9a2a-1514afba22b3</entry>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <entry name="uuid">cebdf629-e283-44bf-9a2a-1514afba22b3</entry>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <entry name="family">Virtual Machine</entry>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     </system>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   </sysinfo>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   <os>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <boot dev="hd"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <smbios mode="sysinfo"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   </os>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   <features>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <acpi/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <apic/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <vmcoreinfo/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   </features>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   <clock offset="utc">
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <timer name="pit" tickpolicy="delay"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <timer name="hpet" present="no"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   </clock>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   <cpu mode="host-model" match="exact">
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <topology sockets="1" cores="1" threads="1"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   </cpu>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   <devices>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <disk type="file" device="disk">
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <target dev="vda" bus="virtio"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <disk type="file" device="cdrom">
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <driver name="qemu" type="raw" cache="none"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <source file="/var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk.config"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <target dev="sda" bus="sata"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     </disk>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <interface type="ethernet">
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <mac address="fa:16:3e:f9:18:e2"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <driver name="vhost" rx_queue_size="512"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <mtu size="1442"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <target dev="tap00a1f903-cf"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     </interface>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <serial type="pty">
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <log file="/var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/console.log" append="off"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     </serial>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <video>
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <model type="virtio"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     </video>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <input type="tablet" bus="usb"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <rng model="virtio">
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <backend model="random">/dev/urandom</backend>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     </rng>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="pci" model="pcie-root-port"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <controller type="usb" index="0"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     <memballoon model="virtio">
Dec 10 10:30:11 compute-0 nova_compute[186989]:       <stats period="10"/>
Dec 10 10:30:11 compute-0 nova_compute[186989]:     </memballoon>
Dec 10 10:30:11 compute-0 nova_compute[186989]:   </devices>
Dec 10 10:30:11 compute-0 nova_compute[186989]: </domain>
Dec 10 10:30:11 compute-0 nova_compute[186989]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.326 186993 DEBUG nova.compute.manager [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Preparing to wait for external event network-vif-plugged-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.327 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.327 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.328 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.329 186993 DEBUG nova.virt.libvirt.vif [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-10T10:30:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-88465170',display_name='tempest-TestNetworkBasicOps-server-88465170',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-88465170',id=13,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH12OsC6l+BK+3BdHFQ2URxLIaiGsn4LygeiL3VxFFfe6CQFkRDS/QEafYDMHjWQ6VeoBF+WSxoFA76M/WA3Tc1QjssBcprA8xf3ad+D/e1vRO9Y6FZt9lFcDz7kSwDDeA==',key_name='tempest-TestNetworkBasicOps-1955377106',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-v3299kmn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-10T10:30:07Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=cebdf629-e283-44bf-9a2a-1514afba22b3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "address": "fa:16:3e:f9:18:e2", "network": {"id": "5bd242ab-51b6-45e4-8b1d-02223f35355a", "bridge": "br-int", "label": "tempest-network-smoke--1541010426", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a1f903-cf", "ovs_interfaceid": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.330 186993 DEBUG nova.network.os_vif_util [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "address": "fa:16:3e:f9:18:e2", "network": {"id": "5bd242ab-51b6-45e4-8b1d-02223f35355a", "bridge": "br-int", "label": "tempest-network-smoke--1541010426", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a1f903-cf", "ovs_interfaceid": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.331 186993 DEBUG nova.network.os_vif_util [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:e2,bridge_name='br-int',has_traffic_filtering=True,id=00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3,network=Network(5bd242ab-51b6-45e4-8b1d-02223f35355a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00a1f903-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.332 186993 DEBUG os_vif [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:e2,bridge_name='br-int',has_traffic_filtering=True,id=00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3,network=Network(5bd242ab-51b6-45e4-8b1d-02223f35355a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00a1f903-cf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.333 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.334 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.335 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.340 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.341 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00a1f903-cf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.342 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap00a1f903-cf, col_values=(('external_ids', {'iface-id': '00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:18:e2', 'vm-uuid': 'cebdf629-e283-44bf-9a2a-1514afba22b3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.344 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:11 compute-0 NetworkManager[55541]: <info>  [1765362611.3451] manager: (tap00a1f903-cf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.347 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.350 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.350 186993 INFO os_vif [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:e2,bridge_name='br-int',has_traffic_filtering=True,id=00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3,network=Network(5bd242ab-51b6-45e4-8b1d-02223f35355a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00a1f903-cf')
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.397 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.397 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.397 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] No VIF found with MAC fa:16:3e:f9:18:e2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.398 186993 INFO nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Using config drive
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.832 186993 INFO nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Creating config drive at /var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk.config
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.844 186993 DEBUG oslo_concurrency.processutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9crv1k68 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:30:11 compute-0 nova_compute[186989]: 2025-12-10 10:30:11.983 186993 DEBUG oslo_concurrency.processutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9crv1k68" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:30:12 compute-0 kernel: tap00a1f903-cf: entered promiscuous mode
Dec 10 10:30:12 compute-0 NetworkManager[55541]: <info>  [1765362612.0476] manager: (tap00a1f903-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.048 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:12 compute-0 ovn_controller[95452]: 2025-12-10T10:30:12Z|00156|binding|INFO|Claiming lport 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 for this chassis.
Dec 10 10:30:12 compute-0 ovn_controller[95452]: 2025-12-10T10:30:12Z|00157|binding|INFO|00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3: Claiming fa:16:3e:f9:18:e2 10.100.0.11
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.055 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.063 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:18:e2 10.100.0.11'], port_security=['fa:16:3e:f9:18:e2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cebdf629-e283-44bf-9a2a-1514afba22b3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bd242ab-51b6-45e4-8b1d-02223f35355a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '2', 'neutron:security_group_ids': '874d554e-2ed4-4162-baeb-e2c60ba33f42', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea3996e8-f011-4f43-8454-610c6e0c021a, chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.064 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 in datapath 5bd242ab-51b6-45e4-8b1d-02223f35355a bound to our chassis
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.065 104302 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5bd242ab-51b6-45e4-8b1d-02223f35355a
Dec 10 10:30:12 compute-0 systemd-udevd[218974]: Network interface NamePolicy= disabled on kernel command line.
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.077 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe71f39-8bd9-4a57-89ef-053e0ba545a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.078 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5bd242ab-51 in ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.080 213247 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5bd242ab-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.080 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[01e09f38-3117-49c9-a9d1-c582fb31d1f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.080 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[738e8282-ab91-416f-91c7-206c63331a06]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 systemd-machined[153379]: New machine qemu-13-instance-0000000d.
Dec 10 10:30:12 compute-0 NetworkManager[55541]: <info>  [1765362612.0871] device (tap00a1f903-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 10 10:30:12 compute-0 NetworkManager[55541]: <info>  [1765362612.0885] device (tap00a1f903-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.090 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[faf3b23f-c228-4179-a8a9-48fe8c5e5bc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.105 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:12 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Dec 10 10:30:12 compute-0 ovn_controller[95452]: 2025-12-10T10:30:12Z|00158|binding|INFO|Setting lport 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 ovn-installed in OVS
Dec 10 10:30:12 compute-0 ovn_controller[95452]: 2025-12-10T10:30:12Z|00159|binding|INFO|Setting lport 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 up in Southbound
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.109 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.115 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[aed260d1-b428-4979-bdee-65876b36c8d1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.143 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[600e5a4f-4104-4d4f-81cf-e1d1a6cd1058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 NetworkManager[55541]: <info>  [1765362612.1489] manager: (tap5bd242ab-50): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.147 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[4e65b1ae-e97a-4a78-b9fd-b7674c0d5980]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.182 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[f6088233-ea02-4bb2-bca8-8997734f5132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.185 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[c4477d19-a3a3-4898-bcf0-337bbe4fad69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 NetworkManager[55541]: <info>  [1765362612.2072] device (tap5bd242ab-50): carrier: link connected
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.212 213279 DEBUG oslo.privsep.daemon [-] privsep: reply[1f8b94bc-452e-4f77-a40f-f43f6145ca7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.229 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[356c0eaa-f315-46f2-8a13-1e0a811b7cb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bd242ab-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362742, 'reachable_time': 18653, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219008, 'error': None, 'target': 'ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.244 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[2564364e-8f30-448c-81a7-b54270e1765d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5b:678c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 362742, 'tstamp': 362742}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219010, 'error': None, 'target': 'ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.259 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[5368457a-88dd-4ec0-acd0-be6a310f5916]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bd242ab-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5b:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362742, 'reachable_time': 18653, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219011, 'error': None, 'target': 'ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.285 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[edb6446a-fd1c-4754-bd69-76ae0aa0f497]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.326 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[c14d23ce-5b77-47f2-8703-73c4f58cf77b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.328 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bd242ab-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.328 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.329 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bd242ab-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.330 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:12 compute-0 kernel: tap5bd242ab-50: entered promiscuous mode
Dec 10 10:30:12 compute-0 NetworkManager[55541]: <info>  [1765362612.3313] manager: (tap5bd242ab-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.334 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5bd242ab-50, col_values=(('external_ids', {'iface-id': 'b3ec78e5-d93f-459b-8861-11481e32717d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:30:12 compute-0 ovn_controller[95452]: 2025-12-10T10:30:12Z|00160|binding|INFO|Releasing lport b3ec78e5-d93f-459b-8861-11481e32717d from this chassis (sb_readonly=0)
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.335 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.348 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.349 104302 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5bd242ab-51b6-45e4-8b1d-02223f35355a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5bd242ab-51b6-45e4-8b1d-02223f35355a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.350 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[c05ba488-e69a-43a3-a6dd-55d051b38195]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.351 104302 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: global
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     log         /dev/log local0 debug
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     log-tag     haproxy-metadata-proxy-5bd242ab-51b6-45e4-8b1d-02223f35355a
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     user        root
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     group       root
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     maxconn     1024
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     pidfile     /var/lib/neutron/external/pids/5bd242ab-51b6-45e4-8b1d-02223f35355a.pid.haproxy
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     daemon
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: defaults
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     log global
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     mode http
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     option httplog
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     option dontlognull
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     option http-server-close
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     option forwardfor
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     retries                 3
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     timeout http-request    30s
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     timeout connect         30s
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     timeout client          32s
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     timeout server          32s
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     timeout http-keep-alive 30s
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: listen listener
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     bind 169.254.169.254:80
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     server metadata /var/lib/neutron/metadata_proxy
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:     http-request add-header X-OVN-Network-ID 5bd242ab-51b6-45e4-8b1d-02223f35355a
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 10 10:30:12 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:12.352 104302 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a', 'env', 'PROCESS_TAG=haproxy-5bd242ab-51b6-45e4-8b1d-02223f35355a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5bd242ab-51b6-45e4-8b1d-02223f35355a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.377 186993 DEBUG nova.compute.manager [req-59b94407-2745-4988-821d-1b750426198f req-f40fc78b-a084-41b9-99e8-962e7cdc08b7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Received event network-vif-plugged-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.378 186993 DEBUG oslo_concurrency.lockutils [req-59b94407-2745-4988-821d-1b750426198f req-f40fc78b-a084-41b9-99e8-962e7cdc08b7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.379 186993 DEBUG oslo_concurrency.lockutils [req-59b94407-2745-4988-821d-1b750426198f req-f40fc78b-a084-41b9-99e8-962e7cdc08b7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.379 186993 DEBUG oslo_concurrency.lockutils [req-59b94407-2745-4988-821d-1b750426198f req-f40fc78b-a084-41b9-99e8-962e7cdc08b7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.379 186993 DEBUG nova.compute.manager [req-59b94407-2745-4988-821d-1b750426198f req-f40fc78b-a084-41b9-99e8-962e7cdc08b7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Processing event network-vif-plugged-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.658 186993 DEBUG nova.compute.manager [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.659 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362612.6587086, cebdf629-e283-44bf-9a2a-1514afba22b3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.660 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] VM Started (Lifecycle Event)
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.665 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.668 186993 INFO nova.virt.libvirt.driver [-] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Instance spawned successfully.
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.668 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.687 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.691 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.695 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.696 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.696 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.697 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.697 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.698 186993 DEBUG nova.virt.libvirt.driver [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 10 10:30:12 compute-0 podman[219050]: 2025-12-10 10:30:12.701359659 +0000 UTC m=+0.053126118 container create 6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.723 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.724 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362612.6588967, cebdf629-e283-44bf-9a2a-1514afba22b3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.725 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] VM Paused (Lifecycle Event)
Dec 10 10:30:12 compute-0 systemd[1]: Started libpod-conmon-6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3.scope.
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.757 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.761 186993 DEBUG nova.virt.driver [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] Emitting event <LifecycleEvent: 1765362612.6641996, cebdf629-e283-44bf-9a2a-1514afba22b3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.762 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] VM Resumed (Lifecycle Event)
Dec 10 10:30:12 compute-0 podman[219050]: 2025-12-10 10:30:12.668426774 +0000 UTC m=+0.020193253 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.766 186993 INFO nova.compute.manager [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Took 5.48 seconds to spawn the instance on the hypervisor.
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.767 186993 DEBUG nova.compute.manager [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:30:12 compute-0 systemd[1]: Started libcrun container.
Dec 10 10:30:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a606d23be616ef25e4fe6fd6f973c1936b12c6eeecbce192b4080aa8e0d5e565/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 10 10:30:12 compute-0 podman[219050]: 2025-12-10 10:30:12.799363012 +0000 UTC m=+0.151129571 container init 6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.800 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:30:12 compute-0 podman[219050]: 2025-12-10 10:30:12.804530141 +0000 UTC m=+0.156296630 container start 6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.804 186993 DEBUG nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 10 10:30:12 compute-0 neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a[219065]: [NOTICE]   (219069) : New worker (219071) forked
Dec 10 10:30:12 compute-0 neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a[219065]: [NOTICE]   (219069) : Loading success.
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.833 186993 INFO nova.compute.manager [None req-fb4a15aa-e4b9-4130-ba64-41c6a6c0d811 - - - - - -] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.862 186993 INFO nova.compute.manager [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Took 5.92 seconds to build instance.
Dec 10 10:30:12 compute-0 nova_compute[186989]: 2025-12-10 10:30:12.908 186993 DEBUG oslo_concurrency.lockutils [None req-2e76654c-d48d-43fd-8f6d-e64ca08ec704 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:13 compute-0 nova_compute[186989]: 2025-12-10 10:30:13.162 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:13 compute-0 nova_compute[186989]: 2025-12-10 10:30:13.593 186993 DEBUG nova.network.neutron [req-e5810789-93c5-4c87-b866-e60e472cf64a req-f50aa6f2-981d-4c74-8c8f-2c9e7956d172 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Updated VIF entry in instance network info cache for port 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:30:13 compute-0 nova_compute[186989]: 2025-12-10 10:30:13.594 186993 DEBUG nova.network.neutron [req-e5810789-93c5-4c87-b866-e60e472cf64a req-f50aa6f2-981d-4c74-8c8f-2c9e7956d172 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Updating instance_info_cache with network_info: [{"id": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "address": "fa:16:3e:f9:18:e2", "network": {"id": "5bd242ab-51b6-45e4-8b1d-02223f35355a", "bridge": "br-int", "label": "tempest-network-smoke--1541010426", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a1f903-cf", "ovs_interfaceid": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:30:13 compute-0 nova_compute[186989]: 2025-12-10 10:30:13.628 186993 DEBUG oslo_concurrency.lockutils [req-e5810789-93c5-4c87-b866-e60e472cf64a req-f50aa6f2-981d-4c74-8c8f-2c9e7956d172 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-cebdf629-e283-44bf-9a2a-1514afba22b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:30:14 compute-0 nova_compute[186989]: 2025-12-10 10:30:14.608 186993 DEBUG nova.compute.manager [req-ba1d8c29-ea01-432a-a240-9addbd0e01c0 req-0e1c5121-b5e2-4591-8cbf-42f106acaf66 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Received event network-vif-plugged-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:30:14 compute-0 nova_compute[186989]: 2025-12-10 10:30:14.609 186993 DEBUG oslo_concurrency.lockutils [req-ba1d8c29-ea01-432a-a240-9addbd0e01c0 req-0e1c5121-b5e2-4591-8cbf-42f106acaf66 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:14 compute-0 nova_compute[186989]: 2025-12-10 10:30:14.609 186993 DEBUG oslo_concurrency.lockutils [req-ba1d8c29-ea01-432a-a240-9addbd0e01c0 req-0e1c5121-b5e2-4591-8cbf-42f106acaf66 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:14 compute-0 nova_compute[186989]: 2025-12-10 10:30:14.610 186993 DEBUG oslo_concurrency.lockutils [req-ba1d8c29-ea01-432a-a240-9addbd0e01c0 req-0e1c5121-b5e2-4591-8cbf-42f106acaf66 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:14 compute-0 nova_compute[186989]: 2025-12-10 10:30:14.610 186993 DEBUG nova.compute.manager [req-ba1d8c29-ea01-432a-a240-9addbd0e01c0 req-0e1c5121-b5e2-4591-8cbf-42f106acaf66 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] No waiting events found dispatching network-vif-plugged-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:30:14 compute-0 nova_compute[186989]: 2025-12-10 10:30:14.610 186993 WARNING nova.compute.manager [req-ba1d8c29-ea01-432a-a240-9addbd0e01c0 req-0e1c5121-b5e2-4591-8cbf-42f106acaf66 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Received unexpected event network-vif-plugged-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 for instance with vm_state active and task_state None.
Dec 10 10:30:16 compute-0 podman[219080]: 2025-12-10 10:30:16.023473923 +0000 UTC m=+0.069765065 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6)
Dec 10 10:30:16 compute-0 nova_compute[186989]: 2025-12-10 10:30:16.346 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:18 compute-0 podman[219102]: 2025-12-10 10:30:18.021639011 +0000 UTC m=+0.063064475 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:30:18 compute-0 nova_compute[186989]: 2025-12-10 10:30:18.164 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:19 compute-0 nova_compute[186989]: 2025-12-10 10:30:19.731 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:19 compute-0 ovn_controller[95452]: 2025-12-10T10:30:19Z|00161|binding|INFO|Releasing lport b3ec78e5-d93f-459b-8861-11481e32717d from this chassis (sb_readonly=0)
Dec 10 10:30:19 compute-0 NetworkManager[55541]: <info>  [1765362619.7347] manager: (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Dec 10 10:30:19 compute-0 NetworkManager[55541]: <info>  [1765362619.7361] manager: (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Dec 10 10:30:19 compute-0 ovn_controller[95452]: 2025-12-10T10:30:19Z|00162|binding|INFO|Releasing lport b3ec78e5-d93f-459b-8861-11481e32717d from this chassis (sb_readonly=0)
Dec 10 10:30:19 compute-0 nova_compute[186989]: 2025-12-10 10:30:19.775 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:19 compute-0 nova_compute[186989]: 2025-12-10 10:30:19.781 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:20 compute-0 nova_compute[186989]: 2025-12-10 10:30:20.389 186993 DEBUG nova.compute.manager [req-e8db1b9c-71bc-4c44-9f0a-5e565cd5ca36 req-b7c2b8bf-29b0-4e2c-b3f5-73333c7b652f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Received event network-changed-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:30:20 compute-0 nova_compute[186989]: 2025-12-10 10:30:20.389 186993 DEBUG nova.compute.manager [req-e8db1b9c-71bc-4c44-9f0a-5e565cd5ca36 req-b7c2b8bf-29b0-4e2c-b3f5-73333c7b652f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Refreshing instance network info cache due to event network-changed-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:30:20 compute-0 nova_compute[186989]: 2025-12-10 10:30:20.390 186993 DEBUG oslo_concurrency.lockutils [req-e8db1b9c-71bc-4c44-9f0a-5e565cd5ca36 req-b7c2b8bf-29b0-4e2c-b3f5-73333c7b652f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-cebdf629-e283-44bf-9a2a-1514afba22b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:30:20 compute-0 nova_compute[186989]: 2025-12-10 10:30:20.390 186993 DEBUG oslo_concurrency.lockutils [req-e8db1b9c-71bc-4c44-9f0a-5e565cd5ca36 req-b7c2b8bf-29b0-4e2c-b3f5-73333c7b652f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-cebdf629-e283-44bf-9a2a-1514afba22b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:30:20 compute-0 nova_compute[186989]: 2025-12-10 10:30:20.390 186993 DEBUG nova.network.neutron [req-e8db1b9c-71bc-4c44-9f0a-5e565cd5ca36 req-b7c2b8bf-29b0-4e2c-b3f5-73333c7b652f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Refreshing network info cache for port 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:30:21 compute-0 nova_compute[186989]: 2025-12-10 10:30:21.396 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:21 compute-0 nova_compute[186989]: 2025-12-10 10:30:21.889 186993 DEBUG nova.network.neutron [req-e8db1b9c-71bc-4c44-9f0a-5e565cd5ca36 req-b7c2b8bf-29b0-4e2c-b3f5-73333c7b652f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Updated VIF entry in instance network info cache for port 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:30:21 compute-0 nova_compute[186989]: 2025-12-10 10:30:21.890 186993 DEBUG nova.network.neutron [req-e8db1b9c-71bc-4c44-9f0a-5e565cd5ca36 req-b7c2b8bf-29b0-4e2c-b3f5-73333c7b652f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Updating instance_info_cache with network_info: [{"id": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "address": "fa:16:3e:f9:18:e2", "network": {"id": "5bd242ab-51b6-45e4-8b1d-02223f35355a", "bridge": "br-int", "label": "tempest-network-smoke--1541010426", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a1f903-cf", "ovs_interfaceid": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:30:21 compute-0 nova_compute[186989]: 2025-12-10 10:30:21.924 186993 DEBUG oslo_concurrency.lockutils [req-e8db1b9c-71bc-4c44-9f0a-5e565cd5ca36 req-b7c2b8bf-29b0-4e2c-b3f5-73333c7b652f 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-cebdf629-e283-44bf-9a2a-1514afba22b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:30:23 compute-0 nova_compute[186989]: 2025-12-10 10:30:23.167 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:24 compute-0 ovn_controller[95452]: 2025-12-10T10:30:24Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f9:18:e2 10.100.0.11
Dec 10 10:30:24 compute-0 ovn_controller[95452]: 2025-12-10T10:30:24Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f9:18:e2 10.100.0.11
Dec 10 10:30:26 compute-0 nova_compute[186989]: 2025-12-10 10:30:26.399 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:28 compute-0 nova_compute[186989]: 2025-12-10 10:30:28.170 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:30 compute-0 podman[219140]: 2025-12-10 10:30:30.036830053 +0000 UTC m=+0.075641713 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 10 10:30:30 compute-0 nova_compute[186989]: 2025-12-10 10:30:30.639 186993 INFO nova.compute.manager [None req-4c2acc05-753a-4517-a30f-33f3c211b99d 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Get console output
Dec 10 10:30:30 compute-0 nova_compute[186989]: 2025-12-10 10:30:30.645 213152 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 10 10:30:31 compute-0 podman[219165]: 2025-12-10 10:30:31.010571763 +0000 UTC m=+0.056561761 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 10 10:30:31 compute-0 nova_compute[186989]: 2025-12-10 10:30:31.401 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:31.473 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:31.473 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:31.474 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:31 compute-0 ovn_controller[95452]: 2025-12-10T10:30:31Z|00163|binding|INFO|Releasing lport b3ec78e5-d93f-459b-8861-11481e32717d from this chassis (sb_readonly=0)
Dec 10 10:30:31 compute-0 nova_compute[186989]: 2025-12-10 10:30:31.822 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:31 compute-0 ovn_controller[95452]: 2025-12-10T10:30:31Z|00164|binding|INFO|Releasing lport b3ec78e5-d93f-459b-8861-11481e32717d from this chassis (sb_readonly=0)
Dec 10 10:30:31 compute-0 nova_compute[186989]: 2025-12-10 10:30:31.908 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:33 compute-0 nova_compute[186989]: 2025-12-10 10:30:33.104 186993 INFO nova.compute.manager [None req-b6845ba2-ac69-4134-9ff6-b49bc23ffccb 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Get console output
Dec 10 10:30:33 compute-0 nova_compute[186989]: 2025-12-10 10:30:33.111 213152 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 10 10:30:33 compute-0 nova_compute[186989]: 2025-12-10 10:30:33.172 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:34 compute-0 nova_compute[186989]: 2025-12-10 10:30:34.213 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:34 compute-0 NetworkManager[55541]: <info>  [1765362634.2147] manager: (patch-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Dec 10 10:30:34 compute-0 NetworkManager[55541]: <info>  [1765362634.2155] manager: (patch-br-int-to-provnet-cde551b2-40a3-4cb6-a1b2-2830ce22206d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Dec 10 10:30:34 compute-0 nova_compute[186989]: 2025-12-10 10:30:34.281 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:34 compute-0 ovn_controller[95452]: 2025-12-10T10:30:34Z|00165|binding|INFO|Releasing lport b3ec78e5-d93f-459b-8861-11481e32717d from this chassis (sb_readonly=0)
Dec 10 10:30:34 compute-0 nova_compute[186989]: 2025-12-10 10:30:34.290 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:34.827 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:d5:69', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:b1:dd:ed:fa:0b'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:30:34 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:34.828 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 10 10:30:34 compute-0 nova_compute[186989]: 2025-12-10 10:30:34.832 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:34 compute-0 nova_compute[186989]: 2025-12-10 10:30:34.902 186993 INFO nova.compute.manager [None req-d2be94b3-9849-44b1-b56e-a49c5ab2dc1e 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Get console output
Dec 10 10:30:34 compute-0 nova_compute[186989]: 2025-12-10 10:30:34.908 213152 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 10 10:30:35 compute-0 nova_compute[186989]: 2025-12-10 10:30:35.985 186993 DEBUG nova.compute.manager [req-69c766f1-c3c1-41ec-8621-8e95a8b644ec req-41b774a6-6e3f-4d7f-ad37-baf9ad4279ac 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Received event network-changed-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:30:35 compute-0 nova_compute[186989]: 2025-12-10 10:30:35.985 186993 DEBUG nova.compute.manager [req-69c766f1-c3c1-41ec-8621-8e95a8b644ec req-41b774a6-6e3f-4d7f-ad37-baf9ad4279ac 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Refreshing instance network info cache due to event network-changed-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 10 10:30:35 compute-0 nova_compute[186989]: 2025-12-10 10:30:35.986 186993 DEBUG oslo_concurrency.lockutils [req-69c766f1-c3c1-41ec-8621-8e95a8b644ec req-41b774a6-6e3f-4d7f-ad37-baf9ad4279ac 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "refresh_cache-cebdf629-e283-44bf-9a2a-1514afba22b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 10 10:30:35 compute-0 nova_compute[186989]: 2025-12-10 10:30:35.986 186993 DEBUG oslo_concurrency.lockutils [req-69c766f1-c3c1-41ec-8621-8e95a8b644ec req-41b774a6-6e3f-4d7f-ad37-baf9ad4279ac 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquired lock "refresh_cache-cebdf629-e283-44bf-9a2a-1514afba22b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 10 10:30:35 compute-0 nova_compute[186989]: 2025-12-10 10:30:35.986 186993 DEBUG nova.network.neutron [req-69c766f1-c3c1-41ec-8621-8e95a8b644ec req-41b774a6-6e3f-4d7f-ad37-baf9ad4279ac 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Refreshing network info cache for port 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.033 186993 DEBUG oslo_concurrency.lockutils [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "cebdf629-e283-44bf-9a2a-1514afba22b3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.034 186993 DEBUG oslo_concurrency.lockutils [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.034 186993 DEBUG oslo_concurrency.lockutils [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.035 186993 DEBUG oslo_concurrency.lockutils [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.035 186993 DEBUG oslo_concurrency.lockutils [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.036 186993 INFO nova.compute.manager [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Terminating instance
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.037 186993 DEBUG nova.compute.manager [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 10 10:30:36 compute-0 kernel: tap00a1f903-cf (unregistering): left promiscuous mode
Dec 10 10:30:36 compute-0 NetworkManager[55541]: <info>  [1765362636.0606] device (tap00a1f903-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 10 10:30:36 compute-0 ovn_controller[95452]: 2025-12-10T10:30:36Z|00166|binding|INFO|Releasing lport 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 from this chassis (sb_readonly=0)
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.101 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:36 compute-0 ovn_controller[95452]: 2025-12-10T10:30:36Z|00167|binding|INFO|Setting lport 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 down in Southbound
Dec 10 10:30:36 compute-0 ovn_controller[95452]: 2025-12-10T10:30:36Z|00168|binding|INFO|Removing iface tap00a1f903-cf ovn-installed in OVS
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.112 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:18:e2 10.100.0.11'], port_security=['fa:16:3e:f9:18:e2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cebdf629-e283-44bf-9a2a-1514afba22b3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bd242ab-51b6-45e4-8b1d-02223f35355a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '82da19f85bb840d2a70395c3d761ef38', 'neutron:revision_number': '4', 'neutron:security_group_ids': '874d554e-2ed4-4162-baeb-e2c60ba33f42', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea3996e8-f011-4f43-8454-610c6e0c021a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>], logical_port=00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f84603c0790>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.114 104302 INFO neutron.agent.ovn.metadata.agent [-] Port 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 in datapath 5bd242ab-51b6-45e4-8b1d-02223f35355a unbound from our chassis
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.115 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.116 104302 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5bd242ab-51b6-45e4-8b1d-02223f35355a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.117 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[bd82d5aa-28b5-4b93-b54f-09ace63ef630]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.118 104302 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a namespace which is not needed anymore
Dec 10 10:30:36 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Dec 10 10:30:36 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 12.745s CPU time.
Dec 10 10:30:36 compute-0 systemd-machined[153379]: Machine qemu-13-instance-0000000d terminated.
Dec 10 10:30:36 compute-0 neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a[219065]: [NOTICE]   (219069) : haproxy version is 2.8.14-c23fe91
Dec 10 10:30:36 compute-0 neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a[219065]: [NOTICE]   (219069) : path to executable is /usr/sbin/haproxy
Dec 10 10:30:36 compute-0 neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a[219065]: [WARNING]  (219069) : Exiting Master process...
Dec 10 10:30:36 compute-0 neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a[219065]: [WARNING]  (219069) : Exiting Master process...
Dec 10 10:30:36 compute-0 neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a[219065]: [ALERT]    (219069) : Current worker (219071) exited with code 143 (Terminated)
Dec 10 10:30:36 compute-0 neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a[219065]: [WARNING]  (219069) : All workers exited. Exiting... (0)
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.263 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:36 compute-0 systemd[1]: libpod-6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3.scope: Deactivated successfully.
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.268 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:36 compute-0 podman[219210]: 2025-12-10 10:30:36.2711113 +0000 UTC m=+0.048937146 container died 6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 10 10:30:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-a606d23be616ef25e4fe6fd6f973c1936b12c6eeecbce192b4080aa8e0d5e565-merged.mount: Deactivated successfully.
Dec 10 10:30:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3-userdata-shm.mount: Deactivated successfully.
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.304 186993 INFO nova.virt.libvirt.driver [-] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Instance destroyed successfully.
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.305 186993 DEBUG nova.objects.instance [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lazy-loading 'resources' on Instance uuid cebdf629-e283-44bf-9a2a-1514afba22b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 10 10:30:36 compute-0 podman[219210]: 2025-12-10 10:30:36.310795035 +0000 UTC m=+0.088620871 container cleanup 6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:30:36 compute-0 systemd[1]: libpod-conmon-6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3.scope: Deactivated successfully.
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.404 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.459 186993 DEBUG nova.virt.libvirt.vif [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-10T10:30:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-88465170',display_name='tempest-TestNetworkBasicOps-server-88465170',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-88465170',id=13,image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH12OsC6l+BK+3BdHFQ2URxLIaiGsn4LygeiL3VxFFfe6CQFkRDS/QEafYDMHjWQ6VeoBF+WSxoFA76M/WA3Tc1QjssBcprA8xf3ad+D/e1vRO9Y6FZt9lFcDz7kSwDDeA==',key_name='tempest-TestNetworkBasicOps-1955377106',keypairs=<?>,launch_index=0,launched_at=2025-12-10T10:30:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='82da19f85bb840d2a70395c3d761ef38',ramdisk_id='',reservation_id='r-v3299kmn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='db4e7c9d-c1ff-44a9-9cd7-57ab019e9474',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-319431412',owner_user_name='tempest-TestNetworkBasicOps-319431412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-10T10:30:12Z,user_data=None,user_id='603f9c3a99e145e4a64248329321a249',uuid=cebdf629-e283-44bf-9a2a-1514afba22b3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "address": "fa:16:3e:f9:18:e2", "network": {"id": "5bd242ab-51b6-45e4-8b1d-02223f35355a", "bridge": "br-int", "label": "tempest-network-smoke--1541010426", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a1f903-cf", "ovs_interfaceid": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.459 186993 DEBUG nova.network.os_vif_util [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converting VIF {"id": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "address": "fa:16:3e:f9:18:e2", "network": {"id": "5bd242ab-51b6-45e4-8b1d-02223f35355a", "bridge": "br-int", "label": "tempest-network-smoke--1541010426", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a1f903-cf", "ovs_interfaceid": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.461 186993 DEBUG nova.network.os_vif_util [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:18:e2,bridge_name='br-int',has_traffic_filtering=True,id=00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3,network=Network(5bd242ab-51b6-45e4-8b1d-02223f35355a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00a1f903-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.461 186993 DEBUG os_vif [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:18:e2,bridge_name='br-int',has_traffic_filtering=True,id=00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3,network=Network(5bd242ab-51b6-45e4-8b1d-02223f35355a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00a1f903-cf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.463 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.464 186993 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00a1f903-cf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.465 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.467 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.469 186993 INFO os_vif [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:18:e2,bridge_name='br-int',has_traffic_filtering=True,id=00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3,network=Network(5bd242ab-51b6-45e4-8b1d-02223f35355a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap00a1f903-cf')
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.470 186993 INFO nova.virt.libvirt.driver [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Deleting instance files /var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3_del
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.471 186993 INFO nova.virt.libvirt.driver [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Deletion of /var/lib/nova/instances/cebdf629-e283-44bf-9a2a-1514afba22b3_del complete
Dec 10 10:30:36 compute-0 podman[219253]: 2025-12-10 10:30:36.491538361 +0000 UTC m=+0.160628527 container remove 6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.501 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[9e7ac27e-8096-4c41-90d6-1e6bb59a9061]: (4, ('Wed Dec 10 10:30:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a (6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3)\n6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3\nWed Dec 10 10:30:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a (6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3)\n6e68e6f41a044663c2db2042d1feccde40387bf7bb633828d8c384db9bb4b0b3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.503 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[f4994db3-b82b-45a2-ace0-47a166590d70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.504 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bd242ab-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.507 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:36 compute-0 kernel: tap5bd242ab-50: left promiscuous mode
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.523 186993 INFO nova.compute.manager [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Took 0.49 seconds to destroy the instance on the hypervisor.
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.524 186993 DEBUG oslo.service.loopingcall [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.525 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.526 186993 DEBUG nova.compute.manager [-] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.527 186993 DEBUG nova.network.neutron [-] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.526 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[ed22d14e-b63f-41ab-94cd-159ed2511ea7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.544 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[111aea43-159a-4b3e-974a-7388fb143bfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.545 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[c334ccd7-ba66-4f99-9964-25c20f33df38]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.561 213247 DEBUG oslo.privsep.daemon [-] privsep: reply[de7f758a-b5ec-4099-910f-71bae97c7c48]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362735, 'reachable_time': 33896, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219271, 'error': None, 'target': 'ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.566 104414 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5bd242ab-51b6-45e4-8b1d-02223f35355a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 10 10:30:36 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:36.566 104414 DEBUG oslo.privsep.daemon [-] privsep: reply[3469327d-e071-4b75-97b9-d20359bb1de3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 10 10:30:36 compute-0 systemd[1]: run-netns-ovnmeta\x2d5bd242ab\x2d51b6\x2d45e4\x2d8b1d\x2d02223f35355a.mount: Deactivated successfully.
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.750 186993 DEBUG nova.compute.manager [req-38fb379a-8130-4ec6-a6fe-e7f3ec3c7b25 req-54a78dda-9116-4d37-86f3-d01992686a3e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Received event network-vif-unplugged-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.750 186993 DEBUG oslo_concurrency.lockutils [req-38fb379a-8130-4ec6-a6fe-e7f3ec3c7b25 req-54a78dda-9116-4d37-86f3-d01992686a3e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.750 186993 DEBUG oslo_concurrency.lockutils [req-38fb379a-8130-4ec6-a6fe-e7f3ec3c7b25 req-54a78dda-9116-4d37-86f3-d01992686a3e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.751 186993 DEBUG oslo_concurrency.lockutils [req-38fb379a-8130-4ec6-a6fe-e7f3ec3c7b25 req-54a78dda-9116-4d37-86f3-d01992686a3e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.751 186993 DEBUG nova.compute.manager [req-38fb379a-8130-4ec6-a6fe-e7f3ec3c7b25 req-54a78dda-9116-4d37-86f3-d01992686a3e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] No waiting events found dispatching network-vif-unplugged-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:30:36 compute-0 nova_compute[186989]: 2025-12-10 10:30:36.751 186993 DEBUG nova.compute.manager [req-38fb379a-8130-4ec6-a6fe-e7f3ec3c7b25 req-54a78dda-9116-4d37-86f3-d01992686a3e 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Received event network-vif-unplugged-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 10 10:30:37 compute-0 nova_compute[186989]: 2025-12-10 10:30:37.138 186993 DEBUG nova.network.neutron [-] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:30:37 compute-0 nova_compute[186989]: 2025-12-10 10:30:37.159 186993 INFO nova.compute.manager [-] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Took 0.63 seconds to deallocate network for instance.
Dec 10 10:30:37 compute-0 nova_compute[186989]: 2025-12-10 10:30:37.210 186993 DEBUG oslo_concurrency.lockutils [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:37 compute-0 nova_compute[186989]: 2025-12-10 10:30:37.211 186993 DEBUG oslo_concurrency.lockutils [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:37 compute-0 nova_compute[186989]: 2025-12-10 10:30:37.291 186993 DEBUG nova.compute.provider_tree [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:30:37 compute-0 nova_compute[186989]: 2025-12-10 10:30:37.308 186993 DEBUG nova.scheduler.client.report [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:30:37 compute-0 nova_compute[186989]: 2025-12-10 10:30:37.376 186993 DEBUG oslo_concurrency.lockutils [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:37 compute-0 nova_compute[186989]: 2025-12-10 10:30:37.408 186993 INFO nova.scheduler.client.report [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Deleted allocations for instance cebdf629-e283-44bf-9a2a-1514afba22b3
Dec 10 10:30:37 compute-0 nova_compute[186989]: 2025-12-10 10:30:37.487 186993 DEBUG oslo_concurrency.lockutils [None req-c82a96d1-f8c8-47a0-b94f-3d4e61b1c062 603f9c3a99e145e4a64248329321a249 82da19f85bb840d2a70395c3d761ef38 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.453s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:37 compute-0 nova_compute[186989]: 2025-12-10 10:30:37.975 186993 DEBUG nova.network.neutron [req-69c766f1-c3c1-41ec-8621-8e95a8b644ec req-41b774a6-6e3f-4d7f-ad37-baf9ad4279ac 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Updated VIF entry in instance network info cache for port 00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 10 10:30:37 compute-0 nova_compute[186989]: 2025-12-10 10:30:37.976 186993 DEBUG nova.network.neutron [req-69c766f1-c3c1-41ec-8621-8e95a8b644ec req-41b774a6-6e3f-4d7f-ad37-baf9ad4279ac 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Updating instance_info_cache with network_info: [{"id": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "address": "fa:16:3e:f9:18:e2", "network": {"id": "5bd242ab-51b6-45e4-8b1d-02223f35355a", "bridge": "br-int", "label": "tempest-network-smoke--1541010426", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "82da19f85bb840d2a70395c3d761ef38", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap00a1f903-cf", "ovs_interfaceid": "00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 10 10:30:37 compute-0 nova_compute[186989]: 2025-12-10 10:30:37.996 186993 DEBUG oslo_concurrency.lockutils [req-69c766f1-c3c1-41ec-8621-8e95a8b644ec req-41b774a6-6e3f-4d7f-ad37-baf9ad4279ac 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Releasing lock "refresh_cache-cebdf629-e283-44bf-9a2a-1514afba22b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 10 10:30:38 compute-0 podman[219273]: 2025-12-10 10:30:38.024790649 +0000 UTC m=+0.061758770 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 10 10:30:38 compute-0 podman[219272]: 2025-12-10 10:30:38.030251386 +0000 UTC m=+0.072715474 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:30:38 compute-0 nova_compute[186989]: 2025-12-10 10:30:38.072 186993 DEBUG nova.compute.manager [req-589829f9-487a-46f0-9663-fed63793b14e req-7c164f2e-f9ae-4595-b6d3-9f53e80ba082 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Received event network-vif-deleted-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:30:38 compute-0 podman[219274]: 2025-12-10 10:30:38.08066746 +0000 UTC m=+0.100135360 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 10 10:30:38 compute-0 nova_compute[186989]: 2025-12-10 10:30:38.204 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:38 compute-0 nova_compute[186989]: 2025-12-10 10:30:38.835 186993 DEBUG nova.compute.manager [req-7eb1b58e-40d2-4349-b510-32c5e6401c09 req-e2d51b9a-a179-42e6-999c-22db53bc8ff7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Received event network-vif-plugged-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 10 10:30:38 compute-0 nova_compute[186989]: 2025-12-10 10:30:38.835 186993 DEBUG oslo_concurrency.lockutils [req-7eb1b58e-40d2-4349-b510-32c5e6401c09 req-e2d51b9a-a179-42e6-999c-22db53bc8ff7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Acquiring lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:30:38 compute-0 nova_compute[186989]: 2025-12-10 10:30:38.836 186993 DEBUG oslo_concurrency.lockutils [req-7eb1b58e-40d2-4349-b510-32c5e6401c09 req-e2d51b9a-a179-42e6-999c-22db53bc8ff7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:30:38 compute-0 nova_compute[186989]: 2025-12-10 10:30:38.836 186993 DEBUG oslo_concurrency.lockutils [req-7eb1b58e-40d2-4349-b510-32c5e6401c09 req-e2d51b9a-a179-42e6-999c-22db53bc8ff7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] Lock "cebdf629-e283-44bf-9a2a-1514afba22b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:30:38 compute-0 nova_compute[186989]: 2025-12-10 10:30:38.836 186993 DEBUG nova.compute.manager [req-7eb1b58e-40d2-4349-b510-32c5e6401c09 req-e2d51b9a-a179-42e6-999c-22db53bc8ff7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] No waiting events found dispatching network-vif-plugged-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 10 10:30:38 compute-0 nova_compute[186989]: 2025-12-10 10:30:38.837 186993 WARNING nova.compute.manager [req-7eb1b58e-40d2-4349-b510-32c5e6401c09 req-e2d51b9a-a179-42e6-999c-22db53bc8ff7 389379bbee9946ca98e5bd9cad84a043 e7b7021b966546e9a12475682d6813f8 - - default default] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Received unexpected event network-vif-plugged-00a1f903-cfeb-48dc-b91b-c5b35ba6d4f3 for instance with vm_state deleted and task_state None.
Dec 10 10:30:41 compute-0 nova_compute[186989]: 2025-12-10 10:30:41.466 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:41 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:30:41.830 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:30:43 compute-0 nova_compute[186989]: 2025-12-10 10:30:43.255 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:43 compute-0 nova_compute[186989]: 2025-12-10 10:30:43.709 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:43 compute-0 nova_compute[186989]: 2025-12-10 10:30:43.775 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.429 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.430 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.430 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.430 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.430 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:30:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:30:46 compute-0 nova_compute[186989]: 2025-12-10 10:30:46.468 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:47 compute-0 podman[219338]: 2025-12-10 10:30:47.01800855 +0000 UTC m=+0.063216489 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 10 10:30:48 compute-0 nova_compute[186989]: 2025-12-10 10:30:48.257 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:49 compute-0 podman[219360]: 2025-12-10 10:30:49.044947381 +0000 UTC m=+0.079456875 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:30:49 compute-0 nova_compute[186989]: 2025-12-10 10:30:49.300 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:30:51 compute-0 nova_compute[186989]: 2025-12-10 10:30:51.303 186993 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765362636.3022351, cebdf629-e283-44bf-9a2a-1514afba22b3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 10 10:30:51 compute-0 nova_compute[186989]: 2025-12-10 10:30:51.305 186993 INFO nova.compute.manager [-] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] VM Stopped (Lifecycle Event)
Dec 10 10:30:51 compute-0 nova_compute[186989]: 2025-12-10 10:30:51.325 186993 DEBUG nova.compute.manager [None req-e2691a7c-c88f-4641-b2d0-08109477fcf0 - - - - - -] [instance: cebdf629-e283-44bf-9a2a-1514afba22b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 10 10:30:51 compute-0 nova_compute[186989]: 2025-12-10 10:30:51.472 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:53 compute-0 nova_compute[186989]: 2025-12-10 10:30:53.260 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:54 compute-0 nova_compute[186989]: 2025-12-10 10:30:54.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:30:54 compute-0 nova_compute[186989]: 2025-12-10 10:30:54.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:30:55 compute-0 sshd-session[219386]: Received disconnect from 80.94.93.233 port 56240:11:  [preauth]
Dec 10 10:30:55 compute-0 sshd-session[219386]: Disconnected from authenticating user root 80.94.93.233 port 56240 [preauth]
Dec 10 10:30:55 compute-0 nova_compute[186989]: 2025-12-10 10:30:55.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:30:56 compute-0 nova_compute[186989]: 2025-12-10 10:30:56.474 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:56 compute-0 nova_compute[186989]: 2025-12-10 10:30:56.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:30:57 compute-0 nova_compute[186989]: 2025-12-10 10:30:57.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:30:57 compute-0 nova_compute[186989]: 2025-12-10 10:30:57.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:30:57 compute-0 nova_compute[186989]: 2025-12-10 10:30:57.923 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:30:57 compute-0 nova_compute[186989]: 2025-12-10 10:30:57.939 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:30:58 compute-0 nova_compute[186989]: 2025-12-10 10:30:58.261 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:30:58 compute-0 nova_compute[186989]: 2025-12-10 10:30:58.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:30:58 compute-0 nova_compute[186989]: 2025-12-10 10:30:58.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:30:58 compute-0 nova_compute[186989]: 2025-12-10 10:30:58.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:30:59 compute-0 nova_compute[186989]: 2025-12-10 10:30:59.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:30:59 compute-0 nova_compute[186989]: 2025-12-10 10:30:59.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 10 10:30:59 compute-0 nova_compute[186989]: 2025-12-10 10:30:59.941 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 10 10:31:00 compute-0 nova_compute[186989]: 2025-12-10 10:31:00.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:31:00 compute-0 nova_compute[186989]: 2025-12-10 10:31:00.965 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:31:00 compute-0 nova_compute[186989]: 2025-12-10 10:31:00.966 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:31:00 compute-0 nova_compute[186989]: 2025-12-10 10:31:00.966 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:31:00 compute-0 nova_compute[186989]: 2025-12-10 10:31:00.966 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:31:01 compute-0 podman[219388]: 2025-12-10 10:31:01.04282268 +0000 UTC m=+0.091526240 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.135 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.136 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5739MB free_disk=73.32891082763672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.136 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.136 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:31:01 compute-0 podman[219413]: 2025-12-10 10:31:01.171734743 +0000 UTC m=+0.094877700 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.400 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.400 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.476 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.505 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Refreshing inventories for resource provider 94de3f96-a911-486c-b08b-8a5da489baa6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.573 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Updating ProviderTree inventory for provider 94de3f96-a911-486c-b08b-8a5da489baa6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.573 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Updating inventory in ProviderTree for provider 94de3f96-a911-486c-b08b-8a5da489baa6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.591 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Refreshing aggregate associations for resource provider 94de3f96-a911-486c-b08b-8a5da489baa6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.614 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Refreshing trait associations for resource provider 94de3f96-a911-486c-b08b-8a5da489baa6, traits: HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_MMX,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_CLMUL,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.636 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.658 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.676 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.677 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:31:01 compute-0 nova_compute[186989]: 2025-12-10 10:31:01.678 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:31:03 compute-0 nova_compute[186989]: 2025-12-10 10:31:03.264 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:06 compute-0 nova_compute[186989]: 2025-12-10 10:31:06.479 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:08 compute-0 nova_compute[186989]: 2025-12-10 10:31:08.267 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:09 compute-0 podman[219432]: 2025-12-10 10:31:09.019545845 +0000 UTC m=+0.060471635 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:31:09 compute-0 podman[219434]: 2025-12-10 10:31:09.043854428 +0000 UTC m=+0.079849276 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Dec 10 10:31:09 compute-0 podman[219433]: 2025-12-10 10:31:09.049782718 +0000 UTC m=+0.084890762 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:31:10 compute-0 nova_compute[186989]: 2025-12-10 10:31:10.931 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:31:10 compute-0 nova_compute[186989]: 2025-12-10 10:31:10.932 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 10 10:31:11 compute-0 nova_compute[186989]: 2025-12-10 10:31:11.481 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:13 compute-0 nova_compute[186989]: 2025-12-10 10:31:13.269 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:14 compute-0 ovn_controller[95452]: 2025-12-10T10:31:14Z|00169|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Dec 10 10:31:16 compute-0 nova_compute[186989]: 2025-12-10 10:31:16.484 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:18 compute-0 podman[219495]: 2025-12-10 10:31:18.057532677 +0000 UTC m=+0.102938936 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Dec 10 10:31:18 compute-0 nova_compute[186989]: 2025-12-10 10:31:18.272 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:20 compute-0 podman[219517]: 2025-12-10 10:31:20.052053448 +0000 UTC m=+0.096467102 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:31:21 compute-0 nova_compute[186989]: 2025-12-10 10:31:21.515 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:23 compute-0 nova_compute[186989]: 2025-12-10 10:31:23.273 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:26 compute-0 nova_compute[186989]: 2025-12-10 10:31:26.517 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:28 compute-0 nova_compute[186989]: 2025-12-10 10:31:28.276 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:31:31.474 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:31:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:31:31.474 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:31:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:31:31.474 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:31:31 compute-0 nova_compute[186989]: 2025-12-10 10:31:31.520 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:32 compute-0 podman[219542]: 2025-12-10 10:31:32.035905507 +0000 UTC m=+0.065659154 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:31:32 compute-0 podman[219541]: 2025-12-10 10:31:32.036170484 +0000 UTC m=+0.073328401 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 10 10:31:33 compute-0 nova_compute[186989]: 2025-12-10 10:31:33.277 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:36 compute-0 nova_compute[186989]: 2025-12-10 10:31:36.523 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:38 compute-0 nova_compute[186989]: 2025-12-10 10:31:38.279 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:38 compute-0 sshd-session[219585]: Accepted publickey for zuul from 192.168.122.10 port 39376 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:31:38 compute-0 systemd-logind[787]: New session 27 of user zuul.
Dec 10 10:31:38 compute-0 systemd[1]: Started Session 27 of User zuul.
Dec 10 10:31:38 compute-0 sshd-session[219585]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:31:38 compute-0 sudo[219589]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 10 10:31:38 compute-0 sudo[219589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:31:39 compute-0 podman[219623]: 2025-12-10 10:31:39.90048898 +0000 UTC m=+0.075392277 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 10 10:31:39 compute-0 podman[219624]: 2025-12-10 10:31:39.900313424 +0000 UTC m=+0.073000182 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd)
Dec 10 10:31:39 compute-0 podman[219625]: 2025-12-10 10:31:39.938182781 +0000 UTC m=+0.105880814 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 10 10:31:41 compute-0 nova_compute[186989]: 2025-12-10 10:31:41.524 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:43 compute-0 nova_compute[186989]: 2025-12-10 10:31:43.281 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:43 compute-0 ovs-vsctl[219822]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 10 10:31:44 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 219613 (sos)
Dec 10 10:31:44 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 10 10:31:44 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 10 10:31:44 compute-0 virtqemud[186713]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 10 10:31:44 compute-0 virtqemud[186713]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 10 10:31:44 compute-0 virtqemud[186713]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 10 10:31:45 compute-0 crontab[220241]: (root) LIST (root)
Dec 10 10:31:46 compute-0 nova_compute[186989]: 2025-12-10 10:31:46.528 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:47 compute-0 systemd[1]: Starting Hostname Service...
Dec 10 10:31:48 compute-0 systemd[1]: Started Hostname Service.
Dec 10 10:31:48 compute-0 nova_compute[186989]: 2025-12-10 10:31:48.282 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:49 compute-0 podman[220441]: 2025-12-10 10:31:49.036368174 +0000 UTC m=+0.075365106 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:31:49 compute-0 nova_compute[186989]: 2025-12-10 10:31:49.936 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:31:50 compute-0 nova_compute[186989]: 2025-12-10 10:31:50.156 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:31:50 compute-0 podman[220617]: 2025-12-10 10:31:50.239586926 +0000 UTC m=+0.062335126 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 10 10:31:51 compute-0 nova_compute[186989]: 2025-12-10 10:31:51.577 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:53 compute-0 nova_compute[186989]: 2025-12-10 10:31:53.285 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:54 compute-0 ovs-appctl[221428]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 10 10:31:54 compute-0 ovs-appctl[221432]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 10 10:31:54 compute-0 ovs-appctl[221436]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 10 10:31:54 compute-0 nova_compute[186989]: 2025-12-10 10:31:54.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:31:54 compute-0 nova_compute[186989]: 2025-12-10 10:31:54.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:31:56 compute-0 nova_compute[186989]: 2025-12-10 10:31:56.579 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:56 compute-0 nova_compute[186989]: 2025-12-10 10:31:56.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:31:57 compute-0 nova_compute[186989]: 2025-12-10 10:31:57.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:31:57 compute-0 nova_compute[186989]: 2025-12-10 10:31:57.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:31:57 compute-0 nova_compute[186989]: 2025-12-10 10:31:57.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:31:57 compute-0 nova_compute[186989]: 2025-12-10 10:31:57.944 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:31:58 compute-0 nova_compute[186989]: 2025-12-10 10:31:58.287 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:31:58 compute-0 nova_compute[186989]: 2025-12-10 10:31:58.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:31:59 compute-0 nova_compute[186989]: 2025-12-10 10:31:59.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:32:00 compute-0 nova_compute[186989]: 2025-12-10 10:32:00.916 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:32:00 compute-0 nova_compute[186989]: 2025-12-10 10:32:00.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:32:00 compute-0 nova_compute[186989]: 2025-12-10 10:32:00.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.526 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.526 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.526 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.527 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.582 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.735 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.737 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5326MB free_disk=72.94095230102539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.737 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.738 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.846 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.848 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:32:01 compute-0 virtqemud[186713]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.924 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.957 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.992 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:32:01 compute-0 nova_compute[186989]: 2025-12-10 10:32:01.993 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:32:02 compute-0 podman[222816]: 2025-12-10 10:32:02.185312082 +0000 UTC m=+0.103004168 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:32:02 compute-0 podman[222815]: 2025-12-10 10:32:02.192255419 +0000 UTC m=+0.110226442 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 10 10:32:03 compute-0 nova_compute[186989]: 2025-12-10 10:32:03.290 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:03 compute-0 systemd[1]: Starting Time & Date Service...
Dec 10 10:32:03 compute-0 systemd[1]: Started Time & Date Service.
Dec 10 10:32:06 compute-0 nova_compute[186989]: 2025-12-10 10:32:06.584 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:08 compute-0 nova_compute[186989]: 2025-12-10 10:32:08.290 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:10 compute-0 podman[223020]: 2025-12-10 10:32:10.051663223 +0000 UTC m=+0.087119752 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 10 10:32:10 compute-0 podman[223019]: 2025-12-10 10:32:10.062382221 +0000 UTC m=+0.091908110 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 10 10:32:10 compute-0 podman[223021]: 2025-12-10 10:32:10.083440197 +0000 UTC m=+0.117775846 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Dec 10 10:32:11 compute-0 nova_compute[186989]: 2025-12-10 10:32:11.586 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:13 compute-0 nova_compute[186989]: 2025-12-10 10:32:13.292 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:16 compute-0 nova_compute[186989]: 2025-12-10 10:32:16.588 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:18 compute-0 nova_compute[186989]: 2025-12-10 10:32:18.294 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:20 compute-0 podman[223083]: 2025-12-10 10:32:20.059213338 +0000 UTC m=+0.086311338 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Dec 10 10:32:21 compute-0 podman[223105]: 2025-12-10 10:32:21.048700526 +0000 UTC m=+0.080665486 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:32:21 compute-0 nova_compute[186989]: 2025-12-10 10:32:21.591 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:22 compute-0 sudo[219589]: pam_unix(sudo:session): session closed for user root
Dec 10 10:32:22 compute-0 sshd-session[219588]: Received disconnect from 192.168.122.10 port 39376:11: disconnected by user
Dec 10 10:32:22 compute-0 sshd-session[219588]: Disconnected from user zuul 192.168.122.10 port 39376
Dec 10 10:32:22 compute-0 sshd-session[219585]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:32:22 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Dec 10 10:32:22 compute-0 systemd[1]: session-27.scope: Consumed 1min 14.963s CPU time, 491.9M memory peak, read 103.0M from disk, written 38.4M to disk.
Dec 10 10:32:22 compute-0 systemd-logind[787]: Session 27 logged out. Waiting for processes to exit.
Dec 10 10:32:22 compute-0 systemd-logind[787]: Removed session 27.
Dec 10 10:32:22 compute-0 sshd-session[223128]: Accepted publickey for zuul from 192.168.122.10 port 34368 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:32:23 compute-0 systemd-logind[787]: New session 28 of user zuul.
Dec 10 10:32:23 compute-0 systemd[1]: Started Session 28 of User zuul.
Dec 10 10:32:23 compute-0 sshd-session[223128]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:32:23 compute-0 sudo[223132]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-12-10-xbilbnc.tar.xz
Dec 10 10:32:23 compute-0 sudo[223132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:32:23 compute-0 sudo[223132]: pam_unix(sudo:session): session closed for user root
Dec 10 10:32:23 compute-0 sshd-session[223131]: Received disconnect from 192.168.122.10 port 34368:11: disconnected by user
Dec 10 10:32:23 compute-0 sshd-session[223131]: Disconnected from user zuul 192.168.122.10 port 34368
Dec 10 10:32:23 compute-0 sshd-session[223128]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:32:23 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Dec 10 10:32:23 compute-0 systemd-logind[787]: Session 28 logged out. Waiting for processes to exit.
Dec 10 10:32:23 compute-0 systemd-logind[787]: Removed session 28.
Dec 10 10:32:23 compute-0 nova_compute[186989]: 2025-12-10 10:32:23.295 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:23 compute-0 sshd-session[223157]: Accepted publickey for zuul from 192.168.122.10 port 34384 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:32:23 compute-0 systemd-logind[787]: New session 29 of user zuul.
Dec 10 10:32:23 compute-0 systemd[1]: Started Session 29 of User zuul.
Dec 10 10:32:23 compute-0 sshd-session[223157]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:32:23 compute-0 sudo[223161]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Dec 10 10:32:23 compute-0 sudo[223161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:32:23 compute-0 sudo[223161]: pam_unix(sudo:session): session closed for user root
Dec 10 10:32:23 compute-0 sshd-session[223160]: Received disconnect from 192.168.122.10 port 34384:11: disconnected by user
Dec 10 10:32:23 compute-0 sshd-session[223160]: Disconnected from user zuul 192.168.122.10 port 34384
Dec 10 10:32:23 compute-0 sshd-session[223157]: pam_unix(sshd:session): session closed for user zuul
Dec 10 10:32:23 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Dec 10 10:32:23 compute-0 systemd-logind[787]: Session 29 logged out. Waiting for processes to exit.
Dec 10 10:32:23 compute-0 systemd-logind[787]: Removed session 29.
Dec 10 10:32:26 compute-0 nova_compute[186989]: 2025-12-10 10:32:26.594 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:28 compute-0 nova_compute[186989]: 2025-12-10 10:32:28.298 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:32:31.475 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:32:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:32:31.475 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:32:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:32:31.476 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:32:31 compute-0 nova_compute[186989]: 2025-12-10 10:32:31.595 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:33 compute-0 podman[223187]: 2025-12-10 10:32:33.028437418 +0000 UTC m=+0.066512032 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 10 10:32:33 compute-0 podman[223186]: 2025-12-10 10:32:33.028403857 +0000 UTC m=+0.065003761 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true)
Dec 10 10:32:33 compute-0 nova_compute[186989]: 2025-12-10 10:32:33.299 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:33 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 10 10:32:33 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 10 10:32:36 compute-0 nova_compute[186989]: 2025-12-10 10:32:36.597 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:38 compute-0 nova_compute[186989]: 2025-12-10 10:32:38.300 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:41 compute-0 podman[223232]: 2025-12-10 10:32:41.043575879 +0000 UTC m=+0.067573512 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 10 10:32:41 compute-0 podman[223231]: 2025-12-10 10:32:41.0462056 +0000 UTC m=+0.076106272 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Dec 10 10:32:41 compute-0 podman[223233]: 2025-12-10 10:32:41.103611835 +0000 UTC m=+0.117434921 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 10 10:32:41 compute-0 nova_compute[186989]: 2025-12-10 10:32:41.601 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:43 compute-0 nova_compute[186989]: 2025-12-10 10:32:43.301 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.430 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.430 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.435 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.435 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.435 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.435 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.436 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.436 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:32:45.436 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:32:46 compute-0 nova_compute[186989]: 2025-12-10 10:32:46.603 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:48 compute-0 nova_compute[186989]: 2025-12-10 10:32:48.303 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:51 compute-0 podman[223295]: 2025-12-10 10:32:51.091692887 +0000 UTC m=+0.127376230 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 10 10:32:51 compute-0 podman[223316]: 2025-12-10 10:32:51.187450521 +0000 UTC m=+0.088000704 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 10 10:32:51 compute-0 nova_compute[186989]: 2025-12-10 10:32:51.604 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:52 compute-0 nova_compute[186989]: 2025-12-10 10:32:52.995 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:32:53 compute-0 nova_compute[186989]: 2025-12-10 10:32:53.305 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:56 compute-0 nova_compute[186989]: 2025-12-10 10:32:56.607 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:32:56 compute-0 nova_compute[186989]: 2025-12-10 10:32:56.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:32:56 compute-0 nova_compute[186989]: 2025-12-10 10:32:56.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:32:57 compute-0 nova_compute[186989]: 2025-12-10 10:32:57.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:32:57 compute-0 nova_compute[186989]: 2025-12-10 10:32:57.923 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:32:57 compute-0 nova_compute[186989]: 2025-12-10 10:32:57.923 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:32:57 compute-0 nova_compute[186989]: 2025-12-10 10:32:57.947 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:32:57 compute-0 nova_compute[186989]: 2025-12-10 10:32:57.947 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:32:58 compute-0 nova_compute[186989]: 2025-12-10 10:32:58.338 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:00 compute-0 nova_compute[186989]: 2025-12-10 10:33:00.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:33:00 compute-0 nova_compute[186989]: 2025-12-10 10:33:00.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:33:00 compute-0 nova_compute[186989]: 2025-12-10 10:33:00.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:33:01 compute-0 nova_compute[186989]: 2025-12-10 10:33:01.608 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:01 compute-0 nova_compute[186989]: 2025-12-10 10:33:01.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:33:02 compute-0 nova_compute[186989]: 2025-12-10 10:33:02.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:33:02 compute-0 nova_compute[186989]: 2025-12-10 10:33:02.977 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:33:02 compute-0 nova_compute[186989]: 2025-12-10 10:33:02.978 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:33:02 compute-0 nova_compute[186989]: 2025-12-10 10:33:02.978 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:33:02 compute-0 nova_compute[186989]: 2025-12-10 10:33:02.978 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:33:03 compute-0 nova_compute[186989]: 2025-12-10 10:33:03.169 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:33:03 compute-0 nova_compute[186989]: 2025-12-10 10:33:03.171 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5582MB free_disk=73.3286361694336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:33:03 compute-0 nova_compute[186989]: 2025-12-10 10:33:03.172 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:33:03 compute-0 nova_compute[186989]: 2025-12-10 10:33:03.173 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:33:03 compute-0 nova_compute[186989]: 2025-12-10 10:33:03.251 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:33:03 compute-0 nova_compute[186989]: 2025-12-10 10:33:03.251 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:33:03 compute-0 nova_compute[186989]: 2025-12-10 10:33:03.273 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:33:03 compute-0 nova_compute[186989]: 2025-12-10 10:33:03.291 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:33:03 compute-0 nova_compute[186989]: 2025-12-10 10:33:03.333 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:33:03 compute-0 nova_compute[186989]: 2025-12-10 10:33:03.333 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:33:03 compute-0 nova_compute[186989]: 2025-12-10 10:33:03.340 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:04 compute-0 podman[223341]: 2025-12-10 10:33:04.03630302 +0000 UTC m=+0.070377617 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 10 10:33:04 compute-0 podman[223340]: 2025-12-10 10:33:04.041250574 +0000 UTC m=+0.079803642 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 10 10:33:06 compute-0 nova_compute[186989]: 2025-12-10 10:33:06.611 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:08 compute-0 nova_compute[186989]: 2025-12-10 10:33:08.342 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:11 compute-0 nova_compute[186989]: 2025-12-10 10:33:11.612 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:12 compute-0 podman[223384]: 2025-12-10 10:33:12.065570663 +0000 UTC m=+0.075438924 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3)
Dec 10 10:33:12 compute-0 podman[223383]: 2025-12-10 10:33:12.083814637 +0000 UTC m=+0.092785874 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 10 10:33:12 compute-0 podman[223385]: 2025-12-10 10:33:12.119830743 +0000 UTC m=+0.120194537 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 10 10:33:13 compute-0 nova_compute[186989]: 2025-12-10 10:33:13.347 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:16 compute-0 nova_compute[186989]: 2025-12-10 10:33:16.614 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:18 compute-0 nova_compute[186989]: 2025-12-10 10:33:18.386 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:21 compute-0 nova_compute[186989]: 2025-12-10 10:33:21.616 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:22 compute-0 podman[223447]: 2025-12-10 10:33:22.044314011 +0000 UTC m=+0.077267463 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:33:22 compute-0 podman[223446]: 2025-12-10 10:33:22.069969616 +0000 UTC m=+0.101558891 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64)
Dec 10 10:33:23 compute-0 nova_compute[186989]: 2025-12-10 10:33:23.390 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:26 compute-0 nova_compute[186989]: 2025-12-10 10:33:26.619 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:28 compute-0 nova_compute[186989]: 2025-12-10 10:33:28.391 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:33:31.476 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:33:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:33:31.477 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:33:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:33:31.478 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:33:31 compute-0 nova_compute[186989]: 2025-12-10 10:33:31.620 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:33 compute-0 nova_compute[186989]: 2025-12-10 10:33:33.392 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:35 compute-0 podman[223492]: 2025-12-10 10:33:35.026567553 +0000 UTC m=+0.061265241 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 10 10:33:35 compute-0 podman[223493]: 2025-12-10 10:33:35.033813479 +0000 UTC m=+0.064428606 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 10 10:33:36 compute-0 nova_compute[186989]: 2025-12-10 10:33:36.621 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:38 compute-0 nova_compute[186989]: 2025-12-10 10:33:38.429 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:41 compute-0 nova_compute[186989]: 2025-12-10 10:33:41.622 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:43 compute-0 podman[223534]: 2025-12-10 10:33:43.047702967 +0000 UTC m=+0.080576444 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:33:43 compute-0 podman[223536]: 2025-12-10 10:33:43.061081869 +0000 UTC m=+0.092191388 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 10 10:33:43 compute-0 podman[223535]: 2025-12-10 10:33:43.061373317 +0000 UTC m=+0.093983477 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 10 10:33:43 compute-0 nova_compute[186989]: 2025-12-10 10:33:43.467 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:46 compute-0 nova_compute[186989]: 2025-12-10 10:33:46.624 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:48 compute-0 nova_compute[186989]: 2025-12-10 10:33:48.469 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:51 compute-0 nova_compute[186989]: 2025-12-10 10:33:51.626 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:53 compute-0 podman[223597]: 2025-12-10 10:33:53.031785233 +0000 UTC m=+0.075309049 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, config_id=edpm, vcs-type=git, container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 10 10:33:53 compute-0 podman[223598]: 2025-12-10 10:33:53.033439237 +0000 UTC m=+0.072443211 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 10 10:33:53 compute-0 nova_compute[186989]: 2025-12-10 10:33:53.329 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:33:53 compute-0 nova_compute[186989]: 2025-12-10 10:33:53.471 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:53 compute-0 nova_compute[186989]: 2025-12-10 10:33:53.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:33:56 compute-0 nova_compute[186989]: 2025-12-10 10:33:56.627 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:57 compute-0 nova_compute[186989]: 2025-12-10 10:33:57.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:33:57 compute-0 nova_compute[186989]: 2025-12-10 10:33:57.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:33:58 compute-0 nova_compute[186989]: 2025-12-10 10:33:58.517 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:33:58 compute-0 nova_compute[186989]: 2025-12-10 10:33:58.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:33:59 compute-0 nova_compute[186989]: 2025-12-10 10:33:59.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:33:59 compute-0 nova_compute[186989]: 2025-12-10 10:33:59.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:33:59 compute-0 nova_compute[186989]: 2025-12-10 10:33:59.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:33:59 compute-0 nova_compute[186989]: 2025-12-10 10:33:59.950 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:34:00 compute-0 nova_compute[186989]: 2025-12-10 10:34:00.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:34:01 compute-0 nova_compute[186989]: 2025-12-10 10:34:01.629 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:01 compute-0 nova_compute[186989]: 2025-12-10 10:34:01.916 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:34:01 compute-0 nova_compute[186989]: 2025-12-10 10:34:01.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:34:01 compute-0 nova_compute[186989]: 2025-12-10 10:34:01.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:34:03 compute-0 nova_compute[186989]: 2025-12-10 10:34:03.520 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:03 compute-0 nova_compute[186989]: 2025-12-10 10:34:03.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:34:03 compute-0 nova_compute[186989]: 2025-12-10 10:34:03.979 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:34:03 compute-0 nova_compute[186989]: 2025-12-10 10:34:03.980 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:34:03 compute-0 nova_compute[186989]: 2025-12-10 10:34:03.981 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:34:03 compute-0 nova_compute[186989]: 2025-12-10 10:34:03.981 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:34:04 compute-0 nova_compute[186989]: 2025-12-10 10:34:04.159 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:34:04 compute-0 nova_compute[186989]: 2025-12-10 10:34:04.160 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5608MB free_disk=73.32861709594727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:34:04 compute-0 nova_compute[186989]: 2025-12-10 10:34:04.160 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:34:04 compute-0 nova_compute[186989]: 2025-12-10 10:34:04.161 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:34:04 compute-0 nova_compute[186989]: 2025-12-10 10:34:04.495 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:34:04 compute-0 nova_compute[186989]: 2025-12-10 10:34:04.496 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:34:04 compute-0 nova_compute[186989]: 2025-12-10 10:34:04.522 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:34:04 compute-0 nova_compute[186989]: 2025-12-10 10:34:04.627 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:34:04 compute-0 nova_compute[186989]: 2025-12-10 10:34:04.630 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:34:04 compute-0 nova_compute[186989]: 2025-12-10 10:34:04.630 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.470s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:34:06 compute-0 podman[223643]: 2025-12-10 10:34:06.031965738 +0000 UTC m=+0.068332781 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 10 10:34:06 compute-0 podman[223644]: 2025-12-10 10:34:06.035758721 +0000 UTC m=+0.063346667 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:34:06 compute-0 nova_compute[186989]: 2025-12-10 10:34:06.631 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:08 compute-0 nova_compute[186989]: 2025-12-10 10:34:08.522 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:11 compute-0 nova_compute[186989]: 2025-12-10 10:34:11.632 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:13 compute-0 nova_compute[186989]: 2025-12-10 10:34:13.523 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:14 compute-0 podman[223686]: 2025-12-10 10:34:14.049627897 +0000 UTC m=+0.074357494 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 10 10:34:14 compute-0 podman[223687]: 2025-12-10 10:34:14.06009056 +0000 UTC m=+0.077922651 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 10 10:34:14 compute-0 podman[223688]: 2025-12-10 10:34:14.071320864 +0000 UTC m=+0.090807980 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 10 10:34:16 compute-0 nova_compute[186989]: 2025-12-10 10:34:16.634 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:18 compute-0 nova_compute[186989]: 2025-12-10 10:34:18.575 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:21 compute-0 nova_compute[186989]: 2025-12-10 10:34:21.635 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:23 compute-0 nova_compute[186989]: 2025-12-10 10:34:23.577 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:24 compute-0 podman[223748]: 2025-12-10 10:34:24.019812276 +0000 UTC m=+0.053389488 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:34:24 compute-0 podman[223747]: 2025-12-10 10:34:24.020690599 +0000 UTC m=+0.060697154 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, vendor=Red Hat, Inc.)
Dec 10 10:34:26 compute-0 nova_compute[186989]: 2025-12-10 10:34:26.637 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:28 compute-0 nova_compute[186989]: 2025-12-10 10:34:28.578 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:34:31.478 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:34:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:34:31.478 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:34:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:34:31.478 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:34:31 compute-0 nova_compute[186989]: 2025-12-10 10:34:31.639 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:33 compute-0 nova_compute[186989]: 2025-12-10 10:34:33.580 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:36 compute-0 nova_compute[186989]: 2025-12-10 10:34:36.641 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:37 compute-0 podman[223789]: 2025-12-10 10:34:37.02844433 +0000 UTC m=+0.072088737 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 10 10:34:37 compute-0 podman[223790]: 2025-12-10 10:34:37.067817209 +0000 UTC m=+0.101618239 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:34:38 compute-0 nova_compute[186989]: 2025-12-10 10:34:38.584 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:41 compute-0 nova_compute[186989]: 2025-12-10 10:34:41.643 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:43 compute-0 nova_compute[186989]: 2025-12-10 10:34:43.586 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:45 compute-0 podman[223832]: 2025-12-10 10:34:45.068664499 +0000 UTC m=+0.097426886 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Dec 10 10:34:45 compute-0 podman[223833]: 2025-12-10 10:34:45.083869611 +0000 UTC m=+0.105693129 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 10 10:34:45 compute-0 podman[223834]: 2025-12-10 10:34:45.129695964 +0000 UTC m=+0.152208331 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.430 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.435 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.435 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:34:45.435 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:34:46 compute-0 nova_compute[186989]: 2025-12-10 10:34:46.645 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:48 compute-0 nova_compute[186989]: 2025-12-10 10:34:48.588 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:51 compute-0 nova_compute[186989]: 2025-12-10 10:34:51.648 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:53 compute-0 nova_compute[186989]: 2025-12-10 10:34:53.625 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:55 compute-0 podman[223898]: 2025-12-10 10:34:55.015822001 +0000 UTC m=+0.060559864 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Dec 10 10:34:55 compute-0 podman[223899]: 2025-12-10 10:34:55.031457956 +0000 UTC m=+0.070028262 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:34:56 compute-0 nova_compute[186989]: 2025-12-10 10:34:56.631 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:34:56 compute-0 nova_compute[186989]: 2025-12-10 10:34:56.649 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:58 compute-0 nova_compute[186989]: 2025-12-10 10:34:58.626 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:34:58 compute-0 nova_compute[186989]: 2025-12-10 10:34:58.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:34:58 compute-0 nova_compute[186989]: 2025-12-10 10:34:58.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:34:59 compute-0 nova_compute[186989]: 2025-12-10 10:34:59.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:35:00 compute-0 nova_compute[186989]: 2025-12-10 10:35:00.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:35:00 compute-0 nova_compute[186989]: 2025-12-10 10:35:00.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:35:00 compute-0 nova_compute[186989]: 2025-12-10 10:35:00.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:35:00 compute-0 nova_compute[186989]: 2025-12-10 10:35:00.948 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:35:01 compute-0 anacron[48455]: Job `cron.daily' started
Dec 10 10:35:01 compute-0 anacron[48455]: Job `cron.daily' terminated
Dec 10 10:35:01 compute-0 nova_compute[186989]: 2025-12-10 10:35:01.650 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:01 compute-0 nova_compute[186989]: 2025-12-10 10:35:01.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:35:01 compute-0 nova_compute[186989]: 2025-12-10 10:35:01.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:35:01 compute-0 nova_compute[186989]: 2025-12-10 10:35:01.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:35:02 compute-0 nova_compute[186989]: 2025-12-10 10:35:02.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:35:03 compute-0 nova_compute[186989]: 2025-12-10 10:35:03.671 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:03 compute-0 nova_compute[186989]: 2025-12-10 10:35:03.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:35:03 compute-0 nova_compute[186989]: 2025-12-10 10:35:03.957 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:35:03 compute-0 nova_compute[186989]: 2025-12-10 10:35:03.957 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:35:03 compute-0 nova_compute[186989]: 2025-12-10 10:35:03.958 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:35:03 compute-0 nova_compute[186989]: 2025-12-10 10:35:03.958 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:35:04 compute-0 nova_compute[186989]: 2025-12-10 10:35:04.160 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:35:04 compute-0 nova_compute[186989]: 2025-12-10 10:35:04.162 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5626MB free_disk=73.32861328125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:35:04 compute-0 nova_compute[186989]: 2025-12-10 10:35:04.162 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:35:04 compute-0 nova_compute[186989]: 2025-12-10 10:35:04.162 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:35:04 compute-0 nova_compute[186989]: 2025-12-10 10:35:04.241 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:35:04 compute-0 nova_compute[186989]: 2025-12-10 10:35:04.241 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:35:04 compute-0 nova_compute[186989]: 2025-12-10 10:35:04.272 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:35:04 compute-0 nova_compute[186989]: 2025-12-10 10:35:04.287 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:35:04 compute-0 nova_compute[186989]: 2025-12-10 10:35:04.290 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:35:04 compute-0 nova_compute[186989]: 2025-12-10 10:35:04.290 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:35:06 compute-0 nova_compute[186989]: 2025-12-10 10:35:06.652 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:08 compute-0 podman[223944]: 2025-12-10 10:35:08.047743522 +0000 UTC m=+0.079312524 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 10 10:35:08 compute-0 podman[223945]: 2025-12-10 10:35:08.066514461 +0000 UTC m=+0.092747918 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:35:08 compute-0 nova_compute[186989]: 2025-12-10 10:35:08.676 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:11 compute-0 nova_compute[186989]: 2025-12-10 10:35:11.653 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:13 compute-0 nova_compute[186989]: 2025-12-10 10:35:13.721 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:16 compute-0 podman[223983]: 2025-12-10 10:35:16.059776175 +0000 UTC m=+0.102076281 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 10 10:35:16 compute-0 podman[223984]: 2025-12-10 10:35:16.078986837 +0000 UTC m=+0.105318429 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 10 10:35:16 compute-0 podman[223988]: 2025-12-10 10:35:16.120630358 +0000 UTC m=+0.141511013 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Dec 10 10:35:16 compute-0 nova_compute[186989]: 2025-12-10 10:35:16.655 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:18 compute-0 nova_compute[186989]: 2025-12-10 10:35:18.724 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:21 compute-0 nova_compute[186989]: 2025-12-10 10:35:21.698 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:23 compute-0 nova_compute[186989]: 2025-12-10 10:35:23.726 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:26 compute-0 podman[224048]: 2025-12-10 10:35:26.038739821 +0000 UTC m=+0.083248591 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 10 10:35:26 compute-0 podman[224049]: 2025-12-10 10:35:26.04938273 +0000 UTC m=+0.083817206 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:35:26 compute-0 nova_compute[186989]: 2025-12-10 10:35:26.700 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:28 compute-0 nova_compute[186989]: 2025-12-10 10:35:28.730 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:35:31.479 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:35:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:35:31.479 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:35:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:35:31.479 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:35:31 compute-0 nova_compute[186989]: 2025-12-10 10:35:31.704 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:33 compute-0 nova_compute[186989]: 2025-12-10 10:35:33.733 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:35 compute-0 nova_compute[186989]: 2025-12-10 10:35:35.277 186993 DEBUG oslo_concurrency.processutils [None req-24eb72c8-33aa-468c-bd66-115c8421c770 adc515f143694adaa52854065b1d8fc6 c2231ca0f94b4b4fa9b96f4406a080b9 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 10 10:35:35 compute-0 nova_compute[186989]: 2025-12-10 10:35:35.300 186993 DEBUG oslo_concurrency.processutils [None req-24eb72c8-33aa-468c-bd66-115c8421c770 adc515f143694adaa52854065b1d8fc6 c2231ca0f94b4b4fa9b96f4406a080b9 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 10 10:35:36 compute-0 nova_compute[186989]: 2025-12-10 10:35:36.706 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:38 compute-0 nova_compute[186989]: 2025-12-10 10:35:38.735 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:39 compute-0 podman[224093]: 2025-12-10 10:35:39.038626664 +0000 UTC m=+0.074482702 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:35:39 compute-0 podman[224092]: 2025-12-10 10:35:39.058435041 +0000 UTC m=+0.090954749 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 10 10:35:41 compute-0 nova_compute[186989]: 2025-12-10 10:35:41.709 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:35:43.362 104302 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '46:d5:69', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:b1:dd:ed:fa:0b'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 10 10:35:43 compute-0 nova_compute[186989]: 2025-12-10 10:35:43.363 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:43 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:35:43.364 104302 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 10 10:35:43 compute-0 nova_compute[186989]: 2025-12-10 10:35:43.752 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:46 compute-0 nova_compute[186989]: 2025-12-10 10:35:46.711 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:47 compute-0 podman[224137]: 2025-12-10 10:35:47.061298927 +0000 UTC m=+0.096864300 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 10 10:35:47 compute-0 podman[224136]: 2025-12-10 10:35:47.06180629 +0000 UTC m=+0.094547796 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:35:47 compute-0 podman[224138]: 2025-12-10 10:35:47.096711407 +0000 UTC m=+0.119941066 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:35:48 compute-0 nova_compute[186989]: 2025-12-10 10:35:48.754 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:50 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:35:50.366 104302 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=65d7f098-ee7c-47ff-b5dd-8c0c64a94f34, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 10 10:35:51 compute-0 nova_compute[186989]: 2025-12-10 10:35:51.713 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:53 compute-0 nova_compute[186989]: 2025-12-10 10:35:53.800 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:56 compute-0 nova_compute[186989]: 2025-12-10 10:35:56.292 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:35:56 compute-0 nova_compute[186989]: 2025-12-10 10:35:56.714 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:57 compute-0 podman[224197]: 2025-12-10 10:35:57.048770712 +0000 UTC m=+0.079643842 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Dec 10 10:35:57 compute-0 podman[224198]: 2025-12-10 10:35:57.063497441 +0000 UTC m=+0.094532886 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:35:57 compute-0 nova_compute[186989]: 2025-12-10 10:35:57.916 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:35:58 compute-0 nova_compute[186989]: 2025-12-10 10:35:58.803 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:35:58 compute-0 nova_compute[186989]: 2025-12-10 10:35:58.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:35:58 compute-0 nova_compute[186989]: 2025-12-10 10:35:58.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:35:59 compute-0 nova_compute[186989]: 2025-12-10 10:35:59.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:35:59 compute-0 nova_compute[186989]: 2025-12-10 10:35:59.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 10 10:36:00 compute-0 nova_compute[186989]: 2025-12-10 10:36:00.216 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 10 10:36:01 compute-0 nova_compute[186989]: 2025-12-10 10:36:01.715 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:02 compute-0 nova_compute[186989]: 2025-12-10 10:36:02.216 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:36:02 compute-0 nova_compute[186989]: 2025-12-10 10:36:02.217 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:36:02 compute-0 nova_compute[186989]: 2025-12-10 10:36:02.218 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:36:02 compute-0 nova_compute[186989]: 2025-12-10 10:36:02.232 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:36:02 compute-0 nova_compute[186989]: 2025-12-10 10:36:02.233 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:36:02 compute-0 nova_compute[186989]: 2025-12-10 10:36:02.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:36:03 compute-0 nova_compute[186989]: 2025-12-10 10:36:03.805 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:03 compute-0 nova_compute[186989]: 2025-12-10 10:36:03.916 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:36:03 compute-0 nova_compute[186989]: 2025-12-10 10:36:03.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:36:03 compute-0 nova_compute[186989]: 2025-12-10 10:36:03.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:36:04 compute-0 nova_compute[186989]: 2025-12-10 10:36:04.475 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:36:04 compute-0 nova_compute[186989]: 2025-12-10 10:36:04.948 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.161 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.161 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.162 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.162 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.446 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.447 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5657MB free_disk=73.32951354980469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.448 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.448 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.633 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.634 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.748 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Refreshing inventories for resource provider 94de3f96-a911-486c-b08b-8a5da489baa6 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.852 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Updating ProviderTree inventory for provider 94de3f96-a911-486c-b08b-8a5da489baa6 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.853 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Updating inventory in ProviderTree for provider 94de3f96-a911-486c-b08b-8a5da489baa6 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.873 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Refreshing aggregate associations for resource provider 94de3f96-a911-486c-b08b-8a5da489baa6, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.910 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Refreshing trait associations for resource provider 94de3f96-a911-486c-b08b-8a5da489baa6, traits: HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_ACCELERATORS,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_MMX,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_ABM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_CLMUL,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_AVX2,HW_CPU_X86_AVX,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.937 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.953 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.954 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:36:05 compute-0 nova_compute[186989]: 2025-12-10 10:36:05.955 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.506s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:36:06 compute-0 nova_compute[186989]: 2025-12-10 10:36:06.717 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:08 compute-0 nova_compute[186989]: 2025-12-10 10:36:08.806 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:08 compute-0 nova_compute[186989]: 2025-12-10 10:36:08.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:36:10 compute-0 podman[224243]: 2025-12-10 10:36:10.062987192 +0000 UTC m=+0.094901927 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:36:10 compute-0 podman[224242]: 2025-12-10 10:36:10.074597287 +0000 UTC m=+0.108981789 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:36:11 compute-0 nova_compute[186989]: 2025-12-10 10:36:11.718 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:13 compute-0 nova_compute[186989]: 2025-12-10 10:36:13.809 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:16 compute-0 nova_compute[186989]: 2025-12-10 10:36:16.721 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:18 compute-0 podman[224285]: 2025-12-10 10:36:18.05312055 +0000 UTC m=+0.082639473 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:36:18 compute-0 podman[224286]: 2025-12-10 10:36:18.079195698 +0000 UTC m=+0.101081154 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 10 10:36:18 compute-0 podman[224287]: 2025-12-10 10:36:18.118933707 +0000 UTC m=+0.136430894 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 10 10:36:18 compute-0 nova_compute[186989]: 2025-12-10 10:36:18.810 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:19 compute-0 nova_compute[186989]: 2025-12-10 10:36:19.944 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:36:19 compute-0 nova_compute[186989]: 2025-12-10 10:36:19.944 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 10 10:36:21 compute-0 nova_compute[186989]: 2025-12-10 10:36:21.722 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:23 compute-0 nova_compute[186989]: 2025-12-10 10:36:23.813 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:26 compute-0 nova_compute[186989]: 2025-12-10 10:36:26.724 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:28 compute-0 podman[224346]: 2025-12-10 10:36:28.035391106 +0000 UTC m=+0.073770443 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 10 10:36:28 compute-0 podman[224347]: 2025-12-10 10:36:28.043823135 +0000 UTC m=+0.088394150 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:36:28 compute-0 nova_compute[186989]: 2025-12-10 10:36:28.814 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:36:31.480 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:36:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:36:31.480 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:36:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:36:31.481 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:36:31 compute-0 nova_compute[186989]: 2025-12-10 10:36:31.725 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:33 compute-0 nova_compute[186989]: 2025-12-10 10:36:33.814 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:36 compute-0 nova_compute[186989]: 2025-12-10 10:36:36.726 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:38 compute-0 nova_compute[186989]: 2025-12-10 10:36:38.817 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:41 compute-0 podman[224389]: 2025-12-10 10:36:41.048539139 +0000 UTC m=+0.083313432 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 10 10:36:41 compute-0 podman[224390]: 2025-12-10 10:36:41.075307755 +0000 UTC m=+0.108296280 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 10 10:36:41 compute-0 nova_compute[186989]: 2025-12-10 10:36:41.728 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:43 compute-0 nova_compute[186989]: 2025-12-10 10:36:43.822 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.430 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.435 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.435 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.435 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.435 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.436 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.436 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.436 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:36:45.436 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:36:46 compute-0 nova_compute[186989]: 2025-12-10 10:36:46.729 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:48 compute-0 nova_compute[186989]: 2025-12-10 10:36:48.824 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:49 compute-0 podman[224433]: 2025-12-10 10:36:49.042070729 +0000 UTC m=+0.086580210 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:36:49 compute-0 podman[224434]: 2025-12-10 10:36:49.059149973 +0000 UTC m=+0.092746768 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:36:49 compute-0 podman[224436]: 2025-12-10 10:36:49.090674049 +0000 UTC m=+0.113314547 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Dec 10 10:36:51 compute-0 nova_compute[186989]: 2025-12-10 10:36:51.731 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:53 compute-0 nova_compute[186989]: 2025-12-10 10:36:53.826 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:56 compute-0 nova_compute[186989]: 2025-12-10 10:36:56.036 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:36:56 compute-0 nova_compute[186989]: 2025-12-10 10:36:56.733 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:58 compute-0 nova_compute[186989]: 2025-12-10 10:36:58.827 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:36:59 compute-0 podman[224499]: 2025-12-10 10:36:59.036170204 +0000 UTC m=+0.065586661 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:36:59 compute-0 podman[224498]: 2025-12-10 10:36:59.067593177 +0000 UTC m=+0.104500427 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git)
Dec 10 10:36:59 compute-0 nova_compute[186989]: 2025-12-10 10:36:59.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:36:59 compute-0 nova_compute[186989]: 2025-12-10 10:36:59.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:37:01 compute-0 nova_compute[186989]: 2025-12-10 10:37:01.735 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:02 compute-0 nova_compute[186989]: 2025-12-10 10:37:02.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:37:03 compute-0 nova_compute[186989]: 2025-12-10 10:37:03.828 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:03 compute-0 nova_compute[186989]: 2025-12-10 10:37:03.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:37:03 compute-0 nova_compute[186989]: 2025-12-10 10:37:03.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:37:03 compute-0 nova_compute[186989]: 2025-12-10 10:37:03.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:37:03 compute-0 nova_compute[186989]: 2025-12-10 10:37:03.939 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:37:04 compute-0 nova_compute[186989]: 2025-12-10 10:37:04.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:37:04 compute-0 nova_compute[186989]: 2025-12-10 10:37:04.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:37:04 compute-0 nova_compute[186989]: 2025-12-10 10:37:04.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:37:04 compute-0 nova_compute[186989]: 2025-12-10 10:37:04.922 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:37:04 compute-0 nova_compute[186989]: 2025-12-10 10:37:04.951 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:37:04 compute-0 nova_compute[186989]: 2025-12-10 10:37:04.952 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:37:04 compute-0 nova_compute[186989]: 2025-12-10 10:37:04.952 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:37:04 compute-0 nova_compute[186989]: 2025-12-10 10:37:04.952 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:37:05 compute-0 nova_compute[186989]: 2025-12-10 10:37:05.132 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:37:05 compute-0 nova_compute[186989]: 2025-12-10 10:37:05.133 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5680MB free_disk=73.32951354980469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:37:05 compute-0 nova_compute[186989]: 2025-12-10 10:37:05.134 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:37:05 compute-0 nova_compute[186989]: 2025-12-10 10:37:05.134 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:37:05 compute-0 nova_compute[186989]: 2025-12-10 10:37:05.204 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:37:05 compute-0 nova_compute[186989]: 2025-12-10 10:37:05.205 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:37:05 compute-0 nova_compute[186989]: 2025-12-10 10:37:05.236 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:37:05 compute-0 nova_compute[186989]: 2025-12-10 10:37:05.256 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:37:05 compute-0 nova_compute[186989]: 2025-12-10 10:37:05.258 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:37:05 compute-0 nova_compute[186989]: 2025-12-10 10:37:05.258 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:37:06 compute-0 nova_compute[186989]: 2025-12-10 10:37:06.254 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:37:06 compute-0 nova_compute[186989]: 2025-12-10 10:37:06.737 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:08 compute-0 nova_compute[186989]: 2025-12-10 10:37:08.837 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:11 compute-0 nova_compute[186989]: 2025-12-10 10:37:11.738 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:12 compute-0 podman[224541]: 2025-12-10 10:37:12.039798726 +0000 UTC m=+0.078184632 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 10 10:37:12 compute-0 podman[224540]: 2025-12-10 10:37:12.039746695 +0000 UTC m=+0.074483292 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 10 10:37:13 compute-0 nova_compute[186989]: 2025-12-10 10:37:13.840 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:16 compute-0 nova_compute[186989]: 2025-12-10 10:37:16.741 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:18 compute-0 nova_compute[186989]: 2025-12-10 10:37:18.882 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:20 compute-0 podman[224582]: 2025-12-10 10:37:20.02454511 +0000 UTC m=+0.061798278 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Dec 10 10:37:20 compute-0 podman[224583]: 2025-12-10 10:37:20.030764499 +0000 UTC m=+0.061057308 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 10 10:37:20 compute-0 podman[224590]: 2025-12-10 10:37:20.116157076 +0000 UTC m=+0.137122582 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 10 10:37:21 compute-0 nova_compute[186989]: 2025-12-10 10:37:21.743 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:23 compute-0 nova_compute[186989]: 2025-12-10 10:37:23.885 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:26 compute-0 nova_compute[186989]: 2025-12-10 10:37:26.746 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:28 compute-0 nova_compute[186989]: 2025-12-10 10:37:28.887 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:30 compute-0 podman[224647]: 2025-12-10 10:37:30.018563024 +0000 UTC m=+0.057110181 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 10 10:37:30 compute-0 podman[224646]: 2025-12-10 10:37:30.019749326 +0000 UTC m=+0.063768332 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 10 10:37:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:37:31.480 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:37:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:37:31.482 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:37:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:37:31.482 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:37:31 compute-0 nova_compute[186989]: 2025-12-10 10:37:31.748 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:33 compute-0 nova_compute[186989]: 2025-12-10 10:37:33.888 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:36 compute-0 nova_compute[186989]: 2025-12-10 10:37:36.749 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:38 compute-0 nova_compute[186989]: 2025-12-10 10:37:38.889 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:41 compute-0 nova_compute[186989]: 2025-12-10 10:37:41.751 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:43 compute-0 podman[224689]: 2025-12-10 10:37:43.032928198 +0000 UTC m=+0.064053480 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 10 10:37:43 compute-0 podman[224690]: 2025-12-10 10:37:43.039756633 +0000 UTC m=+0.065731085 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:37:43 compute-0 nova_compute[186989]: 2025-12-10 10:37:43.890 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:46 compute-0 nova_compute[186989]: 2025-12-10 10:37:46.753 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:48 compute-0 nova_compute[186989]: 2025-12-10 10:37:48.892 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:51 compute-0 podman[224733]: 2025-12-10 10:37:51.028086654 +0000 UTC m=+0.062043565 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:37:51 compute-0 podman[224732]: 2025-12-10 10:37:51.032343669 +0000 UTC m=+0.074983786 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:37:51 compute-0 podman[224735]: 2025-12-10 10:37:51.086826498 +0000 UTC m=+0.108381022 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:37:51 compute-0 nova_compute[186989]: 2025-12-10 10:37:51.755 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:53 compute-0 nova_compute[186989]: 2025-12-10 10:37:53.892 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:55 compute-0 nova_compute[186989]: 2025-12-10 10:37:55.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:37:56 compute-0 nova_compute[186989]: 2025-12-10 10:37:56.757 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:58 compute-0 nova_compute[186989]: 2025-12-10 10:37:58.894 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:37:59 compute-0 nova_compute[186989]: 2025-12-10 10:37:59.916 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:38:01 compute-0 podman[224794]: 2025-12-10 10:38:01.002496206 +0000 UTC m=+0.052901857 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Dec 10 10:38:01 compute-0 podman[224795]: 2025-12-10 10:38:01.010409101 +0000 UTC m=+0.050763488 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:38:01 compute-0 nova_compute[186989]: 2025-12-10 10:38:01.759 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:01 compute-0 nova_compute[186989]: 2025-12-10 10:38:01.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:38:01 compute-0 nova_compute[186989]: 2025-12-10 10:38:01.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:38:03 compute-0 nova_compute[186989]: 2025-12-10 10:38:03.896 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:03 compute-0 nova_compute[186989]: 2025-12-10 10:38:03.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:38:03 compute-0 nova_compute[186989]: 2025-12-10 10:38:03.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:38:03 compute-0 nova_compute[186989]: 2025-12-10 10:38:03.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:38:03 compute-0 nova_compute[186989]: 2025-12-10 10:38:03.937 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:38:04 compute-0 nova_compute[186989]: 2025-12-10 10:38:04.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:38:04 compute-0 nova_compute[186989]: 2025-12-10 10:38:04.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:38:05 compute-0 nova_compute[186989]: 2025-12-10 10:38:05.917 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:38:05 compute-0 nova_compute[186989]: 2025-12-10 10:38:05.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:38:05 compute-0 nova_compute[186989]: 2025-12-10 10:38:05.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:38:05 compute-0 nova_compute[186989]: 2025-12-10 10:38:05.963 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:38:05 compute-0 nova_compute[186989]: 2025-12-10 10:38:05.963 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:38:05 compute-0 nova_compute[186989]: 2025-12-10 10:38:05.964 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:38:05 compute-0 nova_compute[186989]: 2025-12-10 10:38:05.964 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:38:06 compute-0 nova_compute[186989]: 2025-12-10 10:38:06.172 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:38:06 compute-0 nova_compute[186989]: 2025-12-10 10:38:06.173 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5681MB free_disk=73.32951354980469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:38:06 compute-0 nova_compute[186989]: 2025-12-10 10:38:06.173 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:38:06 compute-0 nova_compute[186989]: 2025-12-10 10:38:06.173 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:38:06 compute-0 nova_compute[186989]: 2025-12-10 10:38:06.249 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:38:06 compute-0 nova_compute[186989]: 2025-12-10 10:38:06.249 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:38:06 compute-0 nova_compute[186989]: 2025-12-10 10:38:06.278 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:38:06 compute-0 nova_compute[186989]: 2025-12-10 10:38:06.300 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:38:06 compute-0 nova_compute[186989]: 2025-12-10 10:38:06.302 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:38:06 compute-0 nova_compute[186989]: 2025-12-10 10:38:06.302 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:38:06 compute-0 nova_compute[186989]: 2025-12-10 10:38:06.761 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:07 compute-0 nova_compute[186989]: 2025-12-10 10:38:07.304 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:38:08 compute-0 nova_compute[186989]: 2025-12-10 10:38:08.898 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:11 compute-0 nova_compute[186989]: 2025-12-10 10:38:11.763 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:13 compute-0 nova_compute[186989]: 2025-12-10 10:38:13.899 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:14 compute-0 podman[224840]: 2025-12-10 10:38:14.021345701 +0000 UTC m=+0.056472973 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:38:14 compute-0 podman[224839]: 2025-12-10 10:38:14.053593137 +0000 UTC m=+0.093636002 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:38:16 compute-0 nova_compute[186989]: 2025-12-10 10:38:16.765 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:18 compute-0 nova_compute[186989]: 2025-12-10 10:38:18.901 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:21 compute-0 nova_compute[186989]: 2025-12-10 10:38:21.767 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:22 compute-0 podman[224881]: 2025-12-10 10:38:22.024561665 +0000 UTC m=+0.068989673 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, 
org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 10 10:38:22 compute-0 podman[224882]: 2025-12-10 10:38:22.029774927 +0000 UTC m=+0.069189009 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=multipathd)
Dec 10 10:38:22 compute-0 podman[224883]: 2025-12-10 10:38:22.056150203 +0000 UTC m=+0.093032107 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller)
Dec 10 10:38:23 compute-0 nova_compute[186989]: 2025-12-10 10:38:23.904 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:26 compute-0 nova_compute[186989]: 2025-12-10 10:38:26.769 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:28 compute-0 nova_compute[186989]: 2025-12-10 10:38:28.905 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:38:31.483 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:38:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:38:31.483 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:38:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:38:31.484 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:38:31 compute-0 nova_compute[186989]: 2025-12-10 10:38:31.771 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:32 compute-0 podman[224942]: 2025-12-10 10:38:32.018917547 +0000 UTC m=+0.057928133 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:38:32 compute-0 podman[224941]: 2025-12-10 10:38:32.039117936 +0000 UTC m=+0.079294394 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., release=1755695350, version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 10 10:38:33 compute-0 nova_compute[186989]: 2025-12-10 10:38:33.907 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:36 compute-0 nova_compute[186989]: 2025-12-10 10:38:36.772 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:38 compute-0 nova_compute[186989]: 2025-12-10 10:38:38.940 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:41 compute-0 nova_compute[186989]: 2025-12-10 10:38:41.774 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:43 compute-0 nova_compute[186989]: 2025-12-10 10:38:43.943 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:45 compute-0 podman[224987]: 2025-12-10 10:38:45.037177367 +0000 UTC m=+0.074784410 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 10 10:38:45 compute-0 podman[224986]: 2025-12-10 10:38:45.043069687 +0000 UTC m=+0.085984874 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.431 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.432 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:45 compute-0 ceilometer_agent_compute[197748]: 2025-12-10 10:38:45.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 10 10:38:46 compute-0 nova_compute[186989]: 2025-12-10 10:38:46.777 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:48 compute-0 nova_compute[186989]: 2025-12-10 10:38:48.944 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:51 compute-0 nova_compute[186989]: 2025-12-10 10:38:51.778 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:53 compute-0 podman[225028]: 2025-12-10 10:38:53.05189641 +0000 UTC m=+0.083115102 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:38:53 compute-0 podman[225026]: 2025-12-10 10:38:53.052888557 +0000 UTC m=+0.085507097 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:38:53 compute-0 podman[225027]: 2025-12-10 10:38:53.05664622 +0000 UTC m=+0.087406120 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 10 10:38:53 compute-0 nova_compute[186989]: 2025-12-10 10:38:53.948 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:56 compute-0 nova_compute[186989]: 2025-12-10 10:38:56.779 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:38:56 compute-0 nova_compute[186989]: 2025-12-10 10:38:56.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:38:58 compute-0 nova_compute[186989]: 2025-12-10 10:38:58.952 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:01 compute-0 nova_compute[186989]: 2025-12-10 10:39:01.781 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:03 compute-0 podman[225090]: 2025-12-10 10:39:03.845593497 +0000 UTC m=+0.056199727 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 10 10:39:03 compute-0 podman[225089]: 2025-12-10 10:39:03.869202342 +0000 UTC m=+0.076009068 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, managed_by=edpm_ansible, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 10 10:39:03 compute-0 nova_compute[186989]: 2025-12-10 10:39:03.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:39:03 compute-0 nova_compute[186989]: 2025-12-10 10:39:03.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:39:03 compute-0 nova_compute[186989]: 2025-12-10 10:39:03.921 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:39:03 compute-0 nova_compute[186989]: 2025-12-10 10:39:03.937 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:39:03 compute-0 nova_compute[186989]: 2025-12-10 10:39:03.938 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:39:03 compute-0 nova_compute[186989]: 2025-12-10 10:39:03.938 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:39:03 compute-0 nova_compute[186989]: 2025-12-10 10:39:03.954 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:05 compute-0 nova_compute[186989]: 2025-12-10 10:39:05.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:39:06 compute-0 nova_compute[186989]: 2025-12-10 10:39:06.802 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:06 compute-0 nova_compute[186989]: 2025-12-10 10:39:06.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:39:07 compute-0 nova_compute[186989]: 2025-12-10 10:39:07.917 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:39:07 compute-0 nova_compute[186989]: 2025-12-10 10:39:07.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:39:07 compute-0 nova_compute[186989]: 2025-12-10 10:39:07.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:39:07 compute-0 nova_compute[186989]: 2025-12-10 10:39:07.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:39:07 compute-0 nova_compute[186989]: 2025-12-10 10:39:07.954 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:39:07 compute-0 nova_compute[186989]: 2025-12-10 10:39:07.954 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:39:07 compute-0 nova_compute[186989]: 2025-12-10 10:39:07.954 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:39:07 compute-0 nova_compute[186989]: 2025-12-10 10:39:07.955 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:39:08 compute-0 nova_compute[186989]: 2025-12-10 10:39:08.128 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:39:08 compute-0 nova_compute[186989]: 2025-12-10 10:39:08.129 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5690MB free_disk=73.32953262329102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:39:08 compute-0 nova_compute[186989]: 2025-12-10 10:39:08.129 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:39:08 compute-0 nova_compute[186989]: 2025-12-10 10:39:08.129 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:39:08 compute-0 nova_compute[186989]: 2025-12-10 10:39:08.189 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:39:08 compute-0 nova_compute[186989]: 2025-12-10 10:39:08.190 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:39:08 compute-0 nova_compute[186989]: 2025-12-10 10:39:08.209 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:39:08 compute-0 nova_compute[186989]: 2025-12-10 10:39:08.222 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:39:08 compute-0 nova_compute[186989]: 2025-12-10 10:39:08.223 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:39:08 compute-0 nova_compute[186989]: 2025-12-10 10:39:08.224 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:39:08 compute-0 nova_compute[186989]: 2025-12-10 10:39:08.957 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:11 compute-0 nova_compute[186989]: 2025-12-10 10:39:11.843 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:13 compute-0 nova_compute[186989]: 2025-12-10 10:39:13.958 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:16 compute-0 podman[225133]: 2025-12-10 10:39:16.025897826 +0000 UTC m=+0.064972736 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:39:16 compute-0 podman[225134]: 2025-12-10 10:39:16.076208041 +0000 UTC m=+0.100911948 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 10 10:39:16 compute-0 nova_compute[186989]: 2025-12-10 10:39:16.876 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:18 compute-0 nova_compute[186989]: 2025-12-10 10:39:18.960 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:21 compute-0 nova_compute[186989]: 2025-12-10 10:39:21.880 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:23 compute-0 nova_compute[186989]: 2025-12-10 10:39:23.963 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:24 compute-0 podman[225175]: 2025-12-10 10:39:24.055553439 +0000 UTC m=+0.082625539 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 10 10:39:24 compute-0 podman[225176]: 2025-12-10 10:39:24.06621967 +0000 UTC m=+0.083930745 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 10 10:39:24 compute-0 podman[225177]: 2025-12-10 10:39:24.106917482 +0000 UTC m=+0.118966822 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 10 10:39:26 compute-0 nova_compute[186989]: 2025-12-10 10:39:26.884 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:28 compute-0 nova_compute[186989]: 2025-12-10 10:39:28.964 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:39:31.484 104302 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:39:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:39:31.485 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:39:31 compute-0 ovn_metadata_agent[104297]: 2025-12-10 10:39:31.486 104302 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:39:31 compute-0 nova_compute[186989]: 2025-12-10 10:39:31.888 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:33 compute-0 nova_compute[186989]: 2025-12-10 10:39:33.966 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:34 compute-0 podman[225237]: 2025-12-10 10:39:34.031972801 +0000 UTC m=+0.060304599 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 10 10:39:34 compute-0 podman[225236]: 2025-12-10 10:39:34.036000921 +0000 UTC m=+0.069225853 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 10 10:39:36 compute-0 nova_compute[186989]: 2025-12-10 10:39:36.913 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:38 compute-0 nova_compute[186989]: 2025-12-10 10:39:38.969 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:41 compute-0 nova_compute[186989]: 2025-12-10 10:39:41.917 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:44 compute-0 nova_compute[186989]: 2025-12-10 10:39:44.010 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:46 compute-0 nova_compute[186989]: 2025-12-10 10:39:46.920 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:47 compute-0 podman[225282]: 2025-12-10 10:39:47.025677378 +0000 UTC m=+0.057802550 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:39:47 compute-0 podman[225281]: 2025-12-10 10:39:47.071264945 +0000 UTC m=+0.100322333 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 10 10:39:49 compute-0 nova_compute[186989]: 2025-12-10 10:39:49.012 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:51 compute-0 nova_compute[186989]: 2025-12-10 10:39:51.925 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:54 compute-0 nova_compute[186989]: 2025-12-10 10:39:54.013 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:55 compute-0 podman[225325]: 2025-12-10 10:39:55.038448128 +0000 UTC m=+0.075818953 container health_status 1461fc55a1f6049a1d5279ae6ae3a08e818d488ecc554fd2fdc977e5f2fc76bb (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Dec 10 10:39:55 compute-0 podman[225326]: 2025-12-10 10:39:55.043483566 +0000 UTC m=+0.075430643 container health_status 16d1756dd065e2f28b29a548c632c664732e1caed5bdc5d3a66af354d696fd5a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 10 10:39:55 compute-0 podman[225327]: 2025-12-10 10:39:55.117567849 +0000 UTC m=+0.140740146 container health_status e0bd9db3306642e1ff2484f3d2493dda14fd132def1cb9f57364dd3e62ea14b6 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 10 10:39:56 compute-0 nova_compute[186989]: 2025-12-10 10:39:56.927 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:39:58 compute-0 nova_compute[186989]: 2025-12-10 10:39:58.226 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:39:59 compute-0 nova_compute[186989]: 2025-12-10 10:39:59.014 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:40:01 compute-0 nova_compute[186989]: 2025-12-10 10:40:01.931 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:40:03 compute-0 nova_compute[186989]: 2025-12-10 10:40:03.917 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:40:03 compute-0 nova_compute[186989]: 2025-12-10 10:40:03.946 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:40:03 compute-0 nova_compute[186989]: 2025-12-10 10:40:03.947 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 10 10:40:04 compute-0 nova_compute[186989]: 2025-12-10 10:40:04.016 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:40:04 compute-0 nova_compute[186989]: 2025-12-10 10:40:04.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:40:04 compute-0 nova_compute[186989]: 2025-12-10 10:40:04.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 10 10:40:04 compute-0 nova_compute[186989]: 2025-12-10 10:40:04.922 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 10 10:40:04 compute-0 nova_compute[186989]: 2025-12-10 10:40:04.939 186993 DEBUG nova.compute.manager [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 10 10:40:05 compute-0 podman[225388]: 2025-12-10 10:40:05.013505863 +0000 UTC m=+0.051772116 container health_status 926918b86b7d0401cfa9a723f75a1ad2f09defee7c35d1abc233d414eef09a12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 10 10:40:05 compute-0 podman[225387]: 2025-12-10 10:40:05.031683159 +0000 UTC m=+0.067833024 container health_status 70573056c2f1509f7aeadc66b7872d48309a628072969f2773ae2988b653145e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, version=9.6, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 10 10:40:06 compute-0 nova_compute[186989]: 2025-12-10 10:40:06.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:40:06 compute-0 nova_compute[186989]: 2025-12-10 10:40:06.921 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:40:06 compute-0 nova_compute[186989]: 2025-12-10 10:40:06.982 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:40:07 compute-0 sshd-session[225431]: Accepted publickey for zuul from 192.168.122.10 port 57828 ssh2: ECDSA SHA256:8OpKJxU5jcFLQSGXY13tKWBgmII6DvHAlV4aCFrjtTo
Dec 10 10:40:07 compute-0 systemd-logind[787]: New session 30 of user zuul.
Dec 10 10:40:07 compute-0 systemd[1]: Started Session 30 of User zuul.
Dec 10 10:40:07 compute-0 sshd-session[225431]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 10 10:40:07 compute-0 sudo[225435]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 10 10:40:07 compute-0 sudo[225435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 10 10:40:07 compute-0 nova_compute[186989]: 2025-12-10 10:40:07.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:40:08 compute-0 nova_compute[186989]: 2025-12-10 10:40:08.920 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:40:08 compute-0 nova_compute[186989]: 2025-12-10 10:40:08.974 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:40:08 compute-0 nova_compute[186989]: 2025-12-10 10:40:08.975 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:40:08 compute-0 nova_compute[186989]: 2025-12-10 10:40:08.976 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:40:08 compute-0 nova_compute[186989]: 2025-12-10 10:40:08.976 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 10 10:40:09 compute-0 nova_compute[186989]: 2025-12-10 10:40:09.018 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:40:09 compute-0 nova_compute[186989]: 2025-12-10 10:40:09.125 186993 WARNING nova.virt.libvirt.driver [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 10 10:40:09 compute-0 nova_compute[186989]: 2025-12-10 10:40:09.127 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5680MB free_disk=73.32949447631836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 10 10:40:09 compute-0 nova_compute[186989]: 2025-12-10 10:40:09.127 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 10 10:40:09 compute-0 nova_compute[186989]: 2025-12-10 10:40:09.127 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 10 10:40:09 compute-0 nova_compute[186989]: 2025-12-10 10:40:09.210 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 10 10:40:09 compute-0 nova_compute[186989]: 2025-12-10 10:40:09.210 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 10 10:40:09 compute-0 nova_compute[186989]: 2025-12-10 10:40:09.244 186993 DEBUG nova.compute.provider_tree [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed in ProviderTree for provider: 94de3f96-a911-486c-b08b-8a5da489baa6 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 10 10:40:09 compute-0 nova_compute[186989]: 2025-12-10 10:40:09.264 186993 DEBUG nova.scheduler.client.report [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Inventory has not changed for provider 94de3f96-a911-486c-b08b-8a5da489baa6 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 10 10:40:09 compute-0 nova_compute[186989]: 2025-12-10 10:40:09.265 186993 DEBUG nova.compute.resource_tracker [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 10 10:40:09 compute-0 nova_compute[186989]: 2025-12-10 10:40:09.265 186993 DEBUG oslo_concurrency.lockutils [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 10 10:40:10 compute-0 nova_compute[186989]: 2025-12-10 10:40:10.261 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:40:10 compute-0 nova_compute[186989]: 2025-12-10 10:40:10.262 186993 DEBUG oslo_service.periodic_task [None req-e6eddefe-3af0-447a-ae1d-81c274a86beb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 10 10:40:11 compute-0 nova_compute[186989]: 2025-12-10 10:40:11.983 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:40:12 compute-0 ovs-vsctl[225605]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 10 10:40:12 compute-0 virtqemud[186713]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 10 10:40:13 compute-0 virtqemud[186713]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 10 10:40:13 compute-0 virtqemud[186713]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 10 10:40:14 compute-0 nova_compute[186989]: 2025-12-10 10:40:14.019 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:40:14 compute-0 crontab[226017]: (root) LIST (root)
Dec 10 10:40:16 compute-0 systemd[1]: Starting Hostname Service...
Dec 10 10:40:16 compute-0 systemd[1]: Started Hostname Service.
Dec 10 10:40:16 compute-0 nova_compute[186989]: 2025-12-10 10:40:16.987 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 10 10:40:17 compute-0 podman[226273]: 2025-12-10 10:40:17.837589724 +0000 UTC m=+0.067787653 container health_status 3842ef224030889c5997245531dba4422af6086d61cc8bddd5147210088978f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 10 10:40:17 compute-0 podman[226274]: 2025-12-10 10:40:17.845229344 +0000 UTC m=+0.066292363 container health_status ed6051cf24b1398025eadf1f34d0774ee714df3f1953551cb6d5889a32483af9 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 10 10:40:19 compute-0 nova_compute[186989]: 2025-12-10 10:40:19.021 186993 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
